Feb 17 16:37:25 localhost kernel: Linux version 5.14.0-681.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Wed Feb 11 20:19:22 UTC 2026
Feb 17 16:37:25 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 17 16:37:25 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 17 16:37:25 localhost kernel: BIOS-provided physical RAM map:
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 17 16:37:25 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 17 16:37:25 localhost kernel: NX (Execute Disable) protection: active
Feb 17 16:37:25 localhost kernel: APIC: Static calls initialized
Feb 17 16:37:25 localhost kernel: SMBIOS 2.8 present.
Feb 17 16:37:25 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 17 16:37:25 localhost kernel: Hypervisor detected: KVM
Feb 17 16:37:25 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 17 16:37:25 localhost kernel: kvm-clock: using sched offset of 8994366371 cycles
Feb 17 16:37:25 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 17 16:37:25 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 17 16:37:25 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 17 16:37:25 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 17 16:37:25 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 17 16:37:25 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 17 16:37:25 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 17 16:37:25 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 17 16:37:25 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 17 16:37:25 localhost kernel: Using GB pages for direct mapping
Feb 17 16:37:25 localhost kernel: RAMDISK: [mem 0x1b6f6000-0x29b72fff]
Feb 17 16:37:25 localhost kernel: ACPI: Early table checksum verification disabled
Feb 17 16:37:25 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 17 16:37:25 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 17 16:37:25 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 17 16:37:25 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 17 16:37:25 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 17 16:37:25 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 17 16:37:25 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 17 16:37:25 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 17 16:37:25 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 17 16:37:25 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 17 16:37:25 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 17 16:37:25 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 17 16:37:25 localhost kernel: No NUMA configuration found
Feb 17 16:37:25 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 17 16:37:25 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 17 16:37:25 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 17 16:37:25 localhost kernel: Zone ranges:
Feb 17 16:37:25 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 17 16:37:25 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 17 16:37:25 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 17 16:37:25 localhost kernel:   Device   empty
Feb 17 16:37:25 localhost kernel: Movable zone start for each node
Feb 17 16:37:25 localhost kernel: Early memory node ranges
Feb 17 16:37:25 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 17 16:37:25 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 17 16:37:25 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 17 16:37:25 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 17 16:37:25 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 17 16:37:25 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 17 16:37:25 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 17 16:37:25 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 17 16:37:25 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 17 16:37:25 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 17 16:37:25 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 17 16:37:25 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 17 16:37:25 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 17 16:37:25 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 17 16:37:25 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 17 16:37:25 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 17 16:37:25 localhost kernel: TSC deadline timer available
Feb 17 16:37:25 localhost kernel: CPU topo: Max. logical packages:   8
Feb 17 16:37:25 localhost kernel: CPU topo: Max. logical dies:       8
Feb 17 16:37:25 localhost kernel: CPU topo: Max. dies per package:   1
Feb 17 16:37:25 localhost kernel: CPU topo: Max. threads per core:   1
Feb 17 16:37:25 localhost kernel: CPU topo: Num. cores per package:     1
Feb 17 16:37:25 localhost kernel: CPU topo: Num. threads per package:   1
Feb 17 16:37:25 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 17 16:37:25 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 17 16:37:25 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 17 16:37:25 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 17 16:37:25 localhost kernel: Booting paravirtualized kernel on KVM
Feb 17 16:37:25 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 17 16:37:25 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 17 16:37:25 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 17 16:37:25 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 17 16:37:25 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 17 16:37:25 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 17 16:37:25 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 17 16:37:25 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64", will be passed to user space.
Feb 17 16:37:25 localhost kernel: random: crng init done
Feb 17 16:37:25 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 17 16:37:25 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 17 16:37:25 localhost kernel: Fallback order for Node 0: 0 
Feb 17 16:37:25 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 17 16:37:25 localhost kernel: Policy zone: Normal
Feb 17 16:37:25 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 17 16:37:25 localhost kernel: software IO TLB: area num 8.
Feb 17 16:37:25 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 17 16:37:25 localhost kernel: ftrace: allocating 49565 entries in 194 pages
Feb 17 16:37:25 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 17 16:37:25 localhost kernel: Dynamic Preempt: voluntary
Feb 17 16:37:25 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 17 16:37:25 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 17 16:37:25 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 17 16:37:25 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 17 16:37:25 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 17 16:37:25 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 17 16:37:25 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 17 16:37:25 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 17 16:37:25 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 17 16:37:25 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 17 16:37:25 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 17 16:37:25 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 17 16:37:25 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 17 16:37:25 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 17 16:37:25 localhost kernel: Console: colour VGA+ 80x25
Feb 17 16:37:25 localhost kernel: printk: console [ttyS0] enabled
Feb 17 16:37:25 localhost kernel: ACPI: Core revision 20230331
Feb 17 16:37:25 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 17 16:37:25 localhost kernel: x2apic enabled
Feb 17 16:37:25 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 17 16:37:25 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 17 16:37:25 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 17 16:37:25 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 17 16:37:25 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 17 16:37:25 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 17 16:37:25 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 17 16:37:25 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 17 16:37:25 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 17 16:37:25 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 17 16:37:25 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 17 16:37:25 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 17 16:37:25 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 17 16:37:25 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 17 16:37:25 localhost kernel: active return thunk: retbleed_return_thunk
Feb 17 16:37:25 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 17 16:37:25 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 17 16:37:25 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 17 16:37:25 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 17 16:37:25 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 17 16:37:25 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 17 16:37:25 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 17 16:37:25 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 17 16:37:25 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 17 16:37:25 localhost kernel: landlock: Up and running.
Feb 17 16:37:25 localhost kernel: Yama: becoming mindful.
Feb 17 16:37:25 localhost kernel: SELinux:  Initializing.
Feb 17 16:37:25 localhost kernel: LSM support for eBPF active
Feb 17 16:37:25 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 17 16:37:25 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 17 16:37:25 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 17 16:37:25 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 17 16:37:25 localhost kernel: ... version:                0
Feb 17 16:37:25 localhost kernel: ... bit width:              48
Feb 17 16:37:25 localhost kernel: ... generic registers:      6
Feb 17 16:37:25 localhost kernel: ... value mask:             0000ffffffffffff
Feb 17 16:37:25 localhost kernel: ... max period:             00007fffffffffff
Feb 17 16:37:25 localhost kernel: ... fixed-purpose events:   0
Feb 17 16:37:25 localhost kernel: ... event mask:             000000000000003f
Feb 17 16:37:25 localhost kernel: signal: max sigframe size: 1776
Feb 17 16:37:25 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 17 16:37:25 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 17 16:37:25 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 17 16:37:25 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 17 16:37:25 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 17 16:37:25 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 17 16:37:25 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 17 16:37:25 localhost kernel: node 0 deferred pages initialised in 8ms
Feb 17 16:37:25 localhost kernel: Memory: 7617800K/8388068K available (16384K kernel code, 5795K rwdata, 13948K rodata, 4204K init, 7180K bss, 764376K reserved, 0K cma-reserved)
Feb 17 16:37:25 localhost kernel: devtmpfs: initialized
Feb 17 16:37:25 localhost kernel: x86/mm: Memory block size: 128MB
Feb 17 16:37:25 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 17 16:37:25 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 17 16:37:25 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 17 16:37:25 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 17 16:37:25 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 17 16:37:25 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 17 16:37:25 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 17 16:37:25 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 17 16:37:25 localhost kernel: audit: type=2000 audit(1771346243.932:1): state=initialized audit_enabled=0 res=1
Feb 17 16:37:25 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 17 16:37:25 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 17 16:37:25 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 17 16:37:25 localhost kernel: cpuidle: using governor menu
Feb 17 16:37:25 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 17 16:37:25 localhost kernel: PCI: Using configuration type 1 for base access
Feb 17 16:37:25 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 17 16:37:25 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 17 16:37:25 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 17 16:37:25 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 17 16:37:25 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 17 16:37:25 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 17 16:37:25 localhost kernel: Demotion targets for Node 0: null
Feb 17 16:37:25 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 17 16:37:25 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 17 16:37:25 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 17 16:37:25 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 17 16:37:25 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 17 16:37:25 localhost kernel: ACPI: Interpreter enabled
Feb 17 16:37:25 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 17 16:37:25 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 17 16:37:25 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 17 16:37:25 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 17 16:37:25 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 17 16:37:25 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 17 16:37:25 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [3] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [4] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [5] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [6] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [7] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [8] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [9] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [10] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [11] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [12] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [13] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [14] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [15] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [16] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [17] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [18] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [19] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [20] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [21] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [22] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [23] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [24] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [25] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [26] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [27] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [28] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [29] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [30] registered
Feb 17 16:37:25 localhost kernel: acpiphp: Slot [31] registered
Feb 17 16:37:25 localhost kernel: PCI host bridge to bus 0000:00
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 17 16:37:25 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 17 16:37:25 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 17 16:37:25 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 17 16:37:25 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 17 16:37:25 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 17 16:37:25 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 17 16:37:25 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 17 16:37:25 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 17 16:37:25 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 17 16:37:25 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 17 16:37:25 localhost kernel: iommu: Default domain type: Translated
Feb 17 16:37:25 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 17 16:37:25 localhost kernel: SCSI subsystem initialized
Feb 17 16:37:25 localhost kernel: ACPI: bus type USB registered
Feb 17 16:37:25 localhost kernel: usbcore: registered new interface driver usbfs
Feb 17 16:37:25 localhost kernel: usbcore: registered new interface driver hub
Feb 17 16:37:25 localhost kernel: usbcore: registered new device driver usb
Feb 17 16:37:25 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 17 16:37:25 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 17 16:37:25 localhost kernel: PTP clock support registered
Feb 17 16:37:25 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 17 16:37:25 localhost kernel: NetLabel: Initializing
Feb 17 16:37:25 localhost kernel: NetLabel:  domain hash size = 128
Feb 17 16:37:25 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 17 16:37:25 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 17 16:37:25 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 17 16:37:25 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 17 16:37:25 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 17 16:37:25 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 17 16:37:25 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 17 16:37:25 localhost kernel: vgaarb: loaded
Feb 17 16:37:25 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 17 16:37:25 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 17 16:37:25 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 17 16:37:25 localhost kernel: pnp: PnP ACPI init
Feb 17 16:37:25 localhost kernel: pnp 00:03: [dma 2]
Feb 17 16:37:25 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 17 16:37:25 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 17 16:37:25 localhost kernel: NET: Registered PF_INET protocol family
Feb 17 16:37:25 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 17 16:37:25 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 17 16:37:25 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 17 16:37:25 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 17 16:37:25 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 17 16:37:25 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 17 16:37:25 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 17 16:37:25 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 17 16:37:25 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 17 16:37:25 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 17 16:37:25 localhost kernel: NET: Registered PF_XDP protocol family
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 17 16:37:25 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 17 16:37:25 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 17 16:37:25 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 17 16:37:25 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 22149 usecs
Feb 17 16:37:25 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 17 16:37:25 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 17 16:37:25 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 17 16:37:25 localhost kernel: ACPI: bus type thunderbolt registered
Feb 17 16:37:25 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 17 16:37:25 localhost kernel: Initialise system trusted keyrings
Feb 17 16:37:25 localhost kernel: Key type blacklist registered
Feb 17 16:37:25 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 17 16:37:25 localhost kernel: zbud: loaded
Feb 17 16:37:25 localhost kernel: integrity: Platform Keyring initialized
Feb 17 16:37:25 localhost kernel: integrity: Machine keyring initialized
Feb 17 16:37:25 localhost kernel: Freeing initrd memory: 233972K
Feb 17 16:37:25 localhost kernel: NET: Registered PF_ALG protocol family
Feb 17 16:37:25 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 17 16:37:25 localhost kernel: Key type asymmetric registered
Feb 17 16:37:25 localhost kernel: Asymmetric key parser 'x509' registered
Feb 17 16:37:25 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 17 16:37:25 localhost kernel: io scheduler mq-deadline registered
Feb 17 16:37:25 localhost kernel: io scheduler kyber registered
Feb 17 16:37:25 localhost kernel: io scheduler bfq registered
Feb 17 16:37:25 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 17 16:37:25 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 17 16:37:25 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 17 16:37:25 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 17 16:37:25 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 17 16:37:25 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 17 16:37:25 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 17 16:37:25 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 17 16:37:25 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 17 16:37:25 localhost kernel: Non-volatile memory driver v1.3
Feb 17 16:37:25 localhost kernel: rdac: device handler registered
Feb 17 16:37:25 localhost kernel: hp_sw: device handler registered
Feb 17 16:37:25 localhost kernel: emc: device handler registered
Feb 17 16:37:25 localhost kernel: alua: device handler registered
Feb 17 16:37:25 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 17 16:37:25 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 17 16:37:25 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 17 16:37:25 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 17 16:37:25 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 17 16:37:25 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 17 16:37:25 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 17 16:37:25 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-681.el9.x86_64 uhci_hcd
Feb 17 16:37:25 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 17 16:37:25 localhost kernel: hub 1-0:1.0: USB hub found
Feb 17 16:37:25 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 17 16:37:25 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 17 16:37:25 localhost kernel: usbserial: USB Serial support registered for generic
Feb 17 16:37:25 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 17 16:37:25 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 17 16:37:25 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 17 16:37:25 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 17 16:37:25 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 17 16:37:25 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 17 16:37:25 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-17T16:37:24 UTC (1771346244)
Feb 17 16:37:25 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 17 16:37:25 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 17 16:37:25 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 17 16:37:25 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 17 16:37:25 localhost kernel: usbcore: registered new interface driver usbhid
Feb 17 16:37:25 localhost kernel: usbhid: USB HID core driver
Feb 17 16:37:25 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 17 16:37:25 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 17 16:37:25 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 17 16:37:25 localhost kernel: Initializing XFRM netlink socket
Feb 17 16:37:25 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 17 16:37:25 localhost kernel: Segment Routing with IPv6
Feb 17 16:37:25 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 17 16:37:25 localhost kernel: mpls_gso: MPLS GSO support
Feb 17 16:37:25 localhost kernel: IPI shorthand broadcast: enabled
Feb 17 16:37:25 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 17 16:37:25 localhost kernel: AES CTR mode by8 optimization enabled
Feb 17 16:37:25 localhost kernel: sched_clock: Marking stable (1028004567, 142892502)->(1279867544, -108970475)
Feb 17 16:37:25 localhost kernel: registered taskstats version 1
Feb 17 16:37:25 localhost kernel: Loading compiled-in X.509 certificates
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 17 16:37:25 localhost kernel: Demotion targets for Node 0: null
Feb 17 16:37:25 localhost kernel: page_owner is disabled
Feb 17 16:37:25 localhost kernel: Key type .fscrypt registered
Feb 17 16:37:25 localhost kernel: Key type fscrypt-provisioning registered
Feb 17 16:37:25 localhost kernel: Key type big_key registered
Feb 17 16:37:25 localhost kernel: Key type encrypted registered
Feb 17 16:37:25 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 17 16:37:25 localhost kernel: Loading compiled-in module X.509 certificates
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4fde1469d8033882223ec575c1a1c1b88c9a497b'
Feb 17 16:37:25 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 17 16:37:25 localhost kernel: ima: No architecture policies found
Feb 17 16:37:25 localhost kernel: evm: Initialising EVM extended attributes:
Feb 17 16:37:25 localhost kernel: evm: security.selinux
Feb 17 16:37:25 localhost kernel: evm: security.SMACK64 (disabled)
Feb 17 16:37:25 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 17 16:37:25 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 17 16:37:25 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 17 16:37:25 localhost kernel: evm: security.apparmor (disabled)
Feb 17 16:37:25 localhost kernel: evm: security.ima
Feb 17 16:37:25 localhost kernel: evm: security.capability
Feb 17 16:37:25 localhost kernel: evm: HMAC attrs: 0x1
Feb 17 16:37:25 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 17 16:37:25 localhost kernel: Running certificate verification RSA selftest
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 17 16:37:25 localhost kernel: Running certificate verification ECDSA selftest
Feb 17 16:37:25 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 17 16:37:25 localhost kernel: clk: Disabling unused clocks
Feb 17 16:37:25 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 17 16:37:25 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 17 16:37:25 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 17 16:37:25 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 388K
Feb 17 16:37:25 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 17 16:37:25 localhost kernel: Run /init as init process
Feb 17 16:37:25 localhost kernel:   with arguments:
Feb 17 16:37:25 localhost kernel:     /init
Feb 17 16:37:25 localhost kernel:   with environment:
Feb 17 16:37:25 localhost kernel:     HOME=/
Feb 17 16:37:25 localhost kernel:     TERM=linux
Feb 17 16:37:25 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64
Feb 17 16:37:25 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 17 16:37:25 localhost systemd[1]: Detected virtualization kvm.
Feb 17 16:37:25 localhost systemd[1]: Detected architecture x86-64.
Feb 17 16:37:25 localhost systemd[1]: Running in initrd.
Feb 17 16:37:25 localhost systemd[1]: No hostname configured, using default hostname.
Feb 17 16:37:25 localhost systemd[1]: Hostname set to <localhost>.
Feb 17 16:37:25 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 17 16:37:25 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 17 16:37:25 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 17 16:37:25 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 17 16:37:25 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 17 16:37:25 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 17 16:37:25 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 17 16:37:25 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 17 16:37:25 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 17 16:37:25 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 17 16:37:25 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 17 16:37:25 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 17 16:37:25 localhost systemd[1]: Reached target Local File Systems.
Feb 17 16:37:25 localhost systemd[1]: Reached target Path Units.
Feb 17 16:37:25 localhost systemd[1]: Reached target Slice Units.
Feb 17 16:37:25 localhost systemd[1]: Reached target Swaps.
Feb 17 16:37:25 localhost systemd[1]: Reached target Timer Units.
Feb 17 16:37:25 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 17 16:37:25 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 17 16:37:25 localhost systemd[1]: Listening on Journal Socket.
Feb 17 16:37:25 localhost systemd[1]: Listening on udev Control Socket.
Feb 17 16:37:25 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 17 16:37:25 localhost systemd[1]: Reached target Socket Units.
Feb 17 16:37:25 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 17 16:37:25 localhost systemd[1]: Starting Journal Service...
Feb 17 16:37:25 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 17 16:37:25 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 17 16:37:25 localhost systemd[1]: Starting Create System Users...
Feb 17 16:37:25 localhost systemd[1]: Starting Setup Virtual Console...
Feb 17 16:37:25 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 17 16:37:25 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 17 16:37:25 localhost systemd[1]: Finished Create System Users.
Feb 17 16:37:25 localhost systemd-journald[305]: Journal started
Feb 17 16:37:25 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/a56dc234e7cf4362a05f7ac9718f0a67) is 8.0M, max 153.6M, 145.6M free.
Feb 17 16:37:25 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Feb 17 16:37:25 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Feb 17 16:37:25 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 17 16:37:25 localhost systemd[1]: Started Journal Service.
Feb 17 16:37:25 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 17 16:37:25 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 17 16:37:25 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 17 16:37:25 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 17 16:37:25 localhost systemd[1]: Finished Setup Virtual Console.
Feb 17 16:37:25 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 17 16:37:25 localhost systemd[1]: Starting dracut cmdline hook...
Feb 17 16:37:25 localhost dracut-cmdline[324]: dracut-9 dracut-057-110.git20260130.el9
Feb 17 16:37:25 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-681.el9.x86_64 root=UUID=9d578f93-c4e9-4172-8459-ef150e54751c ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 17 16:37:25 localhost systemd[1]: Finished dracut cmdline hook.
Feb 17 16:37:25 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 17 16:37:25 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 17 16:37:25 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 17 16:37:25 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 17 16:37:25 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 17 16:37:25 localhost kernel: RPC: Registered udp transport module.
Feb 17 16:37:25 localhost kernel: RPC: Registered tcp transport module.
Feb 17 16:37:25 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 17 16:37:25 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 17 16:37:25 localhost rpc.statd[440]: Version 2.5.4 starting
Feb 17 16:37:25 localhost rpc.statd[440]: Initializing NSM state
Feb 17 16:37:25 localhost rpc.idmapd[445]: Setting log level to 0
Feb 17 16:37:25 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 17 16:37:25 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 17 16:37:25 localhost systemd-udevd[458]: Using default interface naming scheme 'rhel-9.0'.
Feb 17 16:37:25 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 17 16:37:25 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 17 16:37:25 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 17 16:37:25 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 17 16:37:25 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 17 16:37:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 17 16:37:25 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 17 16:37:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 17 16:37:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 17 16:37:25 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 17 16:37:25 localhost systemd[1]: Reached target Network.
Feb 17 16:37:25 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 17 16:37:25 localhost systemd[1]: Starting dracut initqueue hook...
Feb 17 16:37:25 localhost kernel: libata version 3.00 loaded.
Feb 17 16:37:25 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 17 16:37:25 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 17 16:37:25 localhost kernel: scsi host0: ata_piix
Feb 17 16:37:25 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 17 16:37:25 localhost kernel: scsi host1: ata_piix
Feb 17 16:37:25 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 17 16:37:25 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 17 16:37:25 localhost systemd-udevd[493]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 16:37:25 localhost kernel:  vda: vda1
Feb 17 16:37:25 localhost kernel: ACPI: bus type drm_connector registered
Feb 17 16:37:26 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 17 16:37:26 localhost kernel: ata1: found unknown device (class 0)
Feb 17 16:37:26 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 17 16:37:26 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 17 16:37:26 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 17 16:37:26 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 17 16:37:26 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 17 16:37:26 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 17 16:37:26 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 17 16:37:26 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 17 16:37:26 localhost kernel: Console: switching to colour dummy device 80x25
Feb 17 16:37:26 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 17 16:37:26 localhost kernel: [drm] features: -context_init
Feb 17 16:37:26 localhost kernel: [drm] number of scanouts: 1
Feb 17 16:37:26 localhost kernel: [drm] number of cap sets: 0
Feb 17 16:37:26 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 17 16:37:26 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 17 16:37:26 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 17 16:37:26 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 17 16:37:26 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 17 16:37:26 localhost systemd[1]: Found device /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 17 16:37:26 localhost systemd[1]: Reached target Initrd Root Device.
Feb 17 16:37:26 localhost systemd[1]: Reached target System Initialization.
Feb 17 16:37:26 localhost systemd[1]: Reached target Basic System.
Feb 17 16:37:26 localhost systemd[1]: Finished dracut initqueue hook.
Feb 17 16:37:26 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 17 16:37:26 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 17 16:37:26 localhost systemd[1]: Reached target Remote File Systems.
Feb 17 16:37:26 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 17 16:37:26 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 17 16:37:26 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c...
Feb 17 16:37:26 localhost systemd-fsck[562]: /usr/sbin/fsck.xfs: XFS file system.
Feb 17 16:37:26 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c.
Feb 17 16:37:26 localhost systemd[1]: Mounting /sysroot...
Feb 17 16:37:26 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 17 16:37:26 localhost kernel: XFS (vda1): Mounting V5 Filesystem 9d578f93-c4e9-4172-8459-ef150e54751c
Feb 17 16:37:26 localhost kernel: XFS (vda1): Ending clean mount
Feb 17 16:37:26 localhost systemd[1]: Mounted /sysroot.
Feb 17 16:37:26 localhost systemd[1]: Reached target Initrd Root File System.
Feb 17 16:37:26 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 17 16:37:26 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 17 16:37:26 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 17 16:37:26 localhost systemd[1]: Reached target Initrd File Systems.
Feb 17 16:37:26 localhost systemd[1]: Reached target Initrd Default Target.
Feb 17 16:37:26 localhost systemd[1]: Starting dracut mount hook...
Feb 17 16:37:26 localhost systemd[1]: Finished dracut mount hook.
Feb 17 16:37:26 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 17 16:37:27 localhost rpc.idmapd[445]: exiting on signal 15
Feb 17 16:37:27 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 17 16:37:27 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 17 16:37:27 localhost systemd[1]: Stopped target Network.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Timer Units.
Feb 17 16:37:27 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 17 16:37:27 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Basic System.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Path Units.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Remote File Systems.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Slice Units.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Socket Units.
Feb 17 16:37:27 localhost systemd[1]: Stopped target System Initialization.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Local File Systems.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Swaps.
Feb 17 16:37:27 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut mount hook.
Feb 17 16:37:27 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 17 16:37:27 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 17 16:37:27 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 17 16:37:27 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 17 16:37:27 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 17 16:37:27 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 17 16:37:27 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 17 16:37:27 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 17 16:37:27 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 17 16:37:27 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 17 16:37:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 17 16:37:27 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 17 16:37:27 localhost systemd[1]: systemd-udevd.service: Consumed 1.176s CPU time.
Feb 17 16:37:27 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Closed udev Control Socket.
Feb 17 16:37:27 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Closed udev Kernel Socket.
Feb 17 16:37:27 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 17 16:37:27 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 17 16:37:27 localhost systemd[1]: Starting Cleanup udev Database...
Feb 17 16:37:27 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 17 16:37:27 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 17 16:37:27 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Stopped Create System Users.
Feb 17 16:37:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 17 16:37:27 localhost systemd[1]: Finished Cleanup udev Database.
Feb 17 16:37:27 localhost systemd[1]: Reached target Switch Root.
Feb 17 16:37:27 localhost systemd[1]: Starting Switch Root...
Feb 17 16:37:27 localhost systemd[1]: Switching root.
Feb 17 16:37:27 localhost systemd-journald[305]: Journal stopped
Feb 17 16:37:28 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Feb 17 16:37:28 localhost kernel: audit: type=1404 audit(1771346247.421:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability open_perms=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 16:37:28 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 16:37:28 localhost kernel: audit: type=1403 audit(1771346247.539:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 17 16:37:28 localhost systemd[1]: Successfully loaded SELinux policy in 123.913ms.
Feb 17 16:37:28 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 40.173ms.
Feb 17 16:37:28 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 17 16:37:28 localhost systemd[1]: Detected virtualization kvm.
Feb 17 16:37:28 localhost systemd[1]: Detected architecture x86-64.
Feb 17 16:37:28 localhost systemd-rc-local-generator[645]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 16:37:28 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Stopped Switch Root.
Feb 17 16:37:28 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 17 16:37:28 localhost systemd[1]: Created slice Slice /system/getty.
Feb 17 16:37:28 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 17 16:37:28 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 17 16:37:28 localhost systemd[1]: Created slice User and Session Slice.
Feb 17 16:37:28 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 17 16:37:28 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 17 16:37:28 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 17 16:37:28 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 17 16:37:28 localhost systemd[1]: Stopped target Switch Root.
Feb 17 16:37:28 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 17 16:37:28 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 17 16:37:28 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 17 16:37:28 localhost systemd[1]: Reached target Path Units.
Feb 17 16:37:28 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 17 16:37:28 localhost systemd[1]: Reached target Slice Units.
Feb 17 16:37:28 localhost systemd[1]: Reached target Swaps.
Feb 17 16:37:28 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 17 16:37:28 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 17 16:37:28 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 17 16:37:28 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 17 16:37:28 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 17 16:37:28 localhost systemd[1]: Listening on udev Control Socket.
Feb 17 16:37:28 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 17 16:37:28 localhost systemd[1]: Mounting Huge Pages File System...
Feb 17 16:37:28 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 17 16:37:28 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 17 16:37:28 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 17 16:37:28 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 17 16:37:28 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 17 16:37:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 17 16:37:28 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 17 16:37:28 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 17 16:37:28 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 17 16:37:28 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 17 16:37:28 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 17 16:37:28 localhost systemd[1]: Stopped Journal Service.
Feb 17 16:37:28 localhost kernel: fuse: init (API version 7.37)
Feb 17 16:37:28 localhost systemd[1]: Starting Journal Service...
Feb 17 16:37:28 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 17 16:37:28 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 17 16:37:28 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 17 16:37:28 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 17 16:37:28 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 17 16:37:28 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 17 16:37:28 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 17 16:37:28 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 17 16:37:28 localhost systemd[1]: Mounted Huge Pages File System.
Feb 17 16:37:28 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 17 16:37:28 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 17 16:37:28 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 17 16:37:28 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 17 16:37:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 17 16:37:28 localhost systemd-journald[693]: Journal started
Feb 17 16:37:28 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 17 16:37:27 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 17 16:37:27 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Started Journal Service.
Feb 17 16:37:28 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 17 16:37:28 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 17 16:37:28 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 17 16:37:28 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 17 16:37:28 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 17 16:37:28 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 17 16:37:28 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 17 16:37:28 localhost systemd[1]: Mounting FUSE Control File System...
Feb 17 16:37:28 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 17 16:37:28 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 17 16:37:28 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 17 16:37:28 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 17 16:37:28 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 17 16:37:28 localhost systemd[1]: Starting Create System Users...
Feb 17 16:37:28 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 17 16:37:28 localhost systemd[1]: Mounted FUSE Control File System.
Feb 17 16:37:28 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/621707288f5710e1387f73ed6d90e964) is 8.0M, max 153.6M, 145.6M free.
Feb 17 16:37:28 localhost systemd-journald[693]: Received client request to flush runtime journal.
Feb 17 16:37:28 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 17 16:37:28 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 17 16:37:28 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 17 16:37:28 localhost systemd[1]: Finished Create System Users.
Feb 17 16:37:28 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 17 16:37:28 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 17 16:37:28 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 17 16:37:28 localhost systemd[1]: Reached target Local File Systems.
Feb 17 16:37:28 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 17 16:37:28 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 17 16:37:28 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 17 16:37:28 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 17 16:37:28 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 17 16:37:28 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 17 16:37:28 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 17 16:37:28 localhost bootctl[711]: Couldn't find EFI system partition, skipping.
Feb 17 16:37:28 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 17 16:37:28 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 17 16:37:28 localhost systemd[1]: Starting Security Auditing Service...
Feb 17 16:37:28 localhost systemd[1]: Starting RPC Bind...
Feb 17 16:37:28 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 17 16:37:28 localhost auditd[717]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 17 16:37:28 localhost auditd[717]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 17 16:37:28 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 17 16:37:28 localhost systemd[1]: Started RPC Bind.
Feb 17 16:37:28 localhost augenrules[722]: /sbin/augenrules: No change
Feb 17 16:37:28 localhost augenrules[737]: No rules
Feb 17 16:37:28 localhost augenrules[737]: enabled 1
Feb 17 16:37:28 localhost augenrules[737]: failure 1
Feb 17 16:37:28 localhost augenrules[737]: pid 717
Feb 17 16:37:28 localhost augenrules[737]: rate_limit 0
Feb 17 16:37:28 localhost augenrules[737]: backlog_limit 8192
Feb 17 16:37:28 localhost augenrules[737]: lost 0
Feb 17 16:37:28 localhost augenrules[737]: backlog 4
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time 60000
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time_actual 0
Feb 17 16:37:28 localhost augenrules[737]: enabled 1
Feb 17 16:37:28 localhost augenrules[737]: failure 1
Feb 17 16:37:28 localhost augenrules[737]: pid 717
Feb 17 16:37:28 localhost augenrules[737]: rate_limit 0
Feb 17 16:37:28 localhost augenrules[737]: backlog_limit 8192
Feb 17 16:37:28 localhost augenrules[737]: lost 0
Feb 17 16:37:28 localhost augenrules[737]: backlog 3
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time 60000
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time_actual 0
Feb 17 16:37:28 localhost augenrules[737]: enabled 1
Feb 17 16:37:28 localhost augenrules[737]: failure 1
Feb 17 16:37:28 localhost augenrules[737]: pid 717
Feb 17 16:37:28 localhost augenrules[737]: rate_limit 0
Feb 17 16:37:28 localhost augenrules[737]: backlog_limit 8192
Feb 17 16:37:28 localhost augenrules[737]: lost 0
Feb 17 16:37:28 localhost augenrules[737]: backlog 3
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time 60000
Feb 17 16:37:28 localhost augenrules[737]: backlog_wait_time_actual 0
Feb 17 16:37:28 localhost systemd[1]: Started Security Auditing Service.
Feb 17 16:37:28 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 17 16:37:28 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 17 16:37:28 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 17 16:37:28 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 17 16:37:28 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 17 16:37:28 localhost systemd[1]: Starting Update is Completed...
Feb 17 16:37:28 localhost systemd-udevd[745]: Using default interface naming scheme 'rhel-9.0'.
Feb 17 16:37:28 localhost systemd[1]: Finished Update is Completed.
Feb 17 16:37:28 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 17 16:37:28 localhost systemd[1]: Reached target System Initialization.
Feb 17 16:37:28 localhost systemd[1]: Started dnf makecache --timer.
Feb 17 16:37:28 localhost systemd[1]: Started Daily rotation of log files.
Feb 17 16:37:28 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 17 16:37:28 localhost systemd[1]: Reached target Timer Units.
Feb 17 16:37:28 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 17 16:37:28 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 17 16:37:28 localhost systemd[1]: Reached target Socket Units.
Feb 17 16:37:28 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 17 16:37:28 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 17 16:37:28 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 17 16:37:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 17 16:37:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 17 16:37:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 17 16:37:28 localhost systemd-udevd[756]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 16:37:28 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 17 16:37:28 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 17 16:37:28 localhost systemd[1]: Reached target Basic System.
Feb 17 16:37:28 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 17 16:37:28 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 17 16:37:28 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 17 16:37:28 localhost dbus-broker-lau[781]: Ready
Feb 17 16:37:28 localhost systemd[1]: Starting NTP client/server...
Feb 17 16:37:28 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 17 16:37:28 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 17 16:37:28 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 17 16:37:29 localhost systemd[1]: Started irqbalance daemon.
Feb 17 16:37:29 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 17 16:37:29 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 16:37:29 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 16:37:29 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 16:37:29 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 17 16:37:29 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 17 16:37:29 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 17 16:37:29 localhost systemd[1]: Starting User Login Management...
Feb 17 16:37:29 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 17 16:37:29 localhost chronyd[814]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 17 16:37:29 localhost chronyd[814]: Loaded 0 symmetric keys
Feb 17 16:37:29 localhost chronyd[814]: Using right/UTC timezone to obtain leap second data
Feb 17 16:37:29 localhost chronyd[814]: Loaded seccomp filter (level 2)
Feb 17 16:37:29 localhost systemd[1]: Started NTP client/server.
Feb 17 16:37:29 localhost systemd-logind[806]: New seat seat0.
Feb 17 16:37:29 localhost systemd-logind[806]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 17 16:37:29 localhost systemd-logind[806]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 17 16:37:29 localhost systemd[1]: Started User Login Management.
Feb 17 16:37:29 localhost kernel: kvm_amd: TSC scaling supported
Feb 17 16:37:29 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 17 16:37:29 localhost kernel: kvm_amd: Nested Paging enabled
Feb 17 16:37:29 localhost kernel: kvm_amd: LBR virtualization supported
Feb 17 16:37:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 17 16:37:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 17 16:37:29 localhost iptables.init[797]: iptables: Applying firewall rules: [  OK  ]
Feb 17 16:37:29 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 17 16:37:29 localhost cloud-init[850]: Cloud-init v. 24.4-8.el9 running 'init-local' at Tue, 17 Feb 2026 16:37:29 +0000. Up 6.07 seconds.
Feb 17 16:37:29 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 17 16:37:29 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 17 16:37:29 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpfzunxw5v.mount: Deactivated successfully.
Feb 17 16:37:29 localhost systemd[1]: Starting Hostname Service...
Feb 17 16:37:29 localhost systemd[1]: Started Hostname Service.
Feb 17 16:37:29 np0005622237.novalocal systemd-hostnamed[864]: Hostname set to <np0005622237.novalocal> (static)
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Reached target Preparation for Network.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Starting Network Manager...
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1753] NetworkManager (version 1.54.3-2.el9) is starting... (boot:1da25986-a7c7-4f2a-b760-e7b6d26f1215)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1760] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1924] manager[0x55f9825f5000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1981] hostname: hostname: using hostnamed
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1981] hostname: static hostname changed from (none) to "np0005622237.novalocal"
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.1987] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2158] manager[0x55f9825f5000]: rfkill: Wi-Fi hardware radio set enabled
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2158] manager[0x55f9825f5000]: rfkill: WWAN hardware radio set enabled
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2260] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2260] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2261] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2262] manager: Networking is enabled by state file
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2264] settings: Loaded settings plugin: keyfile (internal)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2296] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2330] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2349] dhcp: init: Using DHCP client 'internal'
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2354] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2377] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2394] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2413] device (lo): Activation: starting connection 'lo' (408cebb5-d164-4b54-9d84-326ea0ceda94)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2426] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2431] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2470] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Started Network Manager.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2478] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2482] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2484] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2487] device (eth0): carrier: link connected
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2493] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2502] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Reached target Network.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2510] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2519] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2521] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2525] manager: NetworkManager state is now CONNECTING
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2527] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2540] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2544] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2601] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2615] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2647] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2848] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2854] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2861] device (lo): Activation: successful, device activated.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.2871] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Reached target NFS client services.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Reached target Remote File Systems.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.3026] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.3031] manager: NetworkManager state is now CONNECTED_SITE
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.3034] device (eth0): Activation: successful, device activated.
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.3041] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 17 16:37:30 np0005622237.novalocal NetworkManager[868]: <info>  [1771346250.3044] manager: startup complete
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 17 16:37:30 np0005622237.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: Cloud-init v. 24.4-8.el9 running 'init' at Tue, 17 Feb 2026 16:37:30 +0000. Up 7.06 seconds.
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |  eth0  | True |         38.102.83.53         | 255.255.255.0 | global | fa:16:3e:76:88:3c |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |  eth0  | True | fe80::f816:3eff:fe76:883c/64 |       .       |  link  | fa:16:3e:76:88:3c |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 17 16:37:30 np0005622237.novalocal cloud-init[932]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: new group: name=cloud-user, GID=1001
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: add 'cloud-user' to group 'adm'
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: add 'cloud-user' to group 'systemd-journal'
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: add 'cloud-user' to shadow group 'adm'
Feb 17 16:37:31 np0005622237.novalocal useradd[998]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Generating public/private rsa key pair.
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key fingerprint is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: SHA256:esb3TtmZEIcLAuq2BxnDfdX3MAsWx+/6U4WXB8JPz0A root@np0005622237.novalocal
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key's randomart image is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +---[RSA 3072]----+
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |      .   .o.+E  |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |   . o . .  *oO  |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    = . o ..o*.%.|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |   . + . . . ++.O|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    =   S   o  oo|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |   . o o     + oo|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    . o + . o +..|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |     . o . o  .. |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |           .o  .o|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Generating public/private ecdsa key pair.
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key fingerprint is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: SHA256:p4GikhNrfRNtzIz2U3a4R578MDjGn6E57eniUqytujQ root@np0005622237.novalocal
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key's randomart image is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +---[ECDSA 256]---+
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |                 |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |                 |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |                 |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |      *.  .      |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |.   .+.*S+.o     |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: | +....+ +*B .    |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |=... E o=*.O     |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |.o  o ooo+*.B    |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |     oo.++=* .   |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Generating public/private ed25519 key pair.
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key fingerprint is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: SHA256:xBOOneTR/M2loYgvv1EzZp8FY4hEAw07rtKSGxPAbZ8 root@np0005622237.novalocal
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: The key's randomart image is:
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +--[ED25519 256]--+
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |        *B+      |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: | . .   B *+o .. .|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |  o o . @..o.+++ |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |   o . +.o. o.+o |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    . E S.  *   .|
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |     + .. .+ + o |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    * o  o.   o  |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |     *    ..     |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: |    .     ..     |
Feb 17 16:37:31 np0005622237.novalocal cloud-init[932]: +----[SHA256]-----+
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Reached target Network is Online.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting System Logging Service...
Feb 17 16:37:31 np0005622237.novalocal sm-notify[1014]: Version 2.5.4 starting
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting Permit User Sessions...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 17 16:37:31 np0005622237.novalocal sshd[1016]: Server listening on 0.0.0.0 port 22.
Feb 17 16:37:31 np0005622237.novalocal sshd[1016]: Server listening on :: port 22.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Finished Permit User Sessions.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started Command Scheduler.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started Getty on tty1.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 17 16:37:31 np0005622237.novalocal rsyslogd[1015]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1015" x-info="https://www.rsyslog.com"] start
Feb 17 16:37:31 np0005622237.novalocal rsyslogd[1015]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 17 16:37:31 np0005622237.novalocal crond[1020]: (CRON) STARTUP (1.5.7)
Feb 17 16:37:31 np0005622237.novalocal crond[1020]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Reached target Login Prompts.
Feb 17 16:37:31 np0005622237.novalocal crond[1020]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 85% if used.)
Feb 17 16:37:31 np0005622237.novalocal crond[1020]: (CRON) INFO (running with inotify support)
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Started System Logging Service.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Reached target Multi-User System.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 17 16:37:31 np0005622237.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 17 16:37:32 np0005622237.novalocal rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 16:37:32 np0005622237.novalocal kdumpctl[1029]: kdump: No kdump initial ramdisk found.
Feb 17 16:37:32 np0005622237.novalocal kdumpctl[1029]: kdump: Rebuilding /boot/initramfs-5.14.0-681.el9.x86_64kdump.img
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1174]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Tue, 17 Feb 2026 16:37:32 +0000. Up 8.63 seconds.
Feb 17 16:37:32 np0005622237.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 17 16:37:32 np0005622237.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1441]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Tue, 17 Feb 2026 16:37:32 +0000. Up 8.98 seconds.
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1470]: #############################################################
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1474]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1481]: 256 SHA256:p4GikhNrfRNtzIz2U3a4R578MDjGn6E57eniUqytujQ root@np0005622237.novalocal (ECDSA)
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1488]: 256 SHA256:xBOOneTR/M2loYgvv1EzZp8FY4hEAw07rtKSGxPAbZ8 root@np0005622237.novalocal (ED25519)
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1493]: 3072 SHA256:esb3TtmZEIcLAuq2BxnDfdX3MAsWx+/6U4WXB8JPz0A root@np0005622237.novalocal (RSA)
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1498]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1500]: #############################################################
Feb 17 16:37:32 np0005622237.novalocal cloud-init[1441]: Cloud-init v. 24.4-8.el9 finished at Tue, 17 Feb 2026 16:37:32 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.14 seconds
Feb 17 16:37:32 np0005622237.novalocal dracut[1521]: dracut-057-110.git20260130.el9
Feb 17 16:37:32 np0005622237.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 17 16:37:32 np0005622237.novalocal systemd[1]: Reached target Cloud-init target.
Feb 17 16:37:32 np0005622237.novalocal dracut[1523]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/9d578f93-c4e9-4172-8459-ef150e54751c /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-681.el9.x86_64kdump.img 5.14.0-681.el9.x86_64
Feb 17 16:37:32 np0005622237.novalocal sshd-session[1582]: Connection reset by 38.102.83.114 port 34274 [preauth]
Feb 17 16:37:32 np0005622237.novalocal sshd-session[1591]: Unable to negotiate with 38.102.83.114 port 34286: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 17 16:37:32 np0005622237.novalocal sshd-session[1593]: Connection reset by 38.102.83.114 port 34294 [preauth]
Feb 17 16:37:32 np0005622237.novalocal sshd-session[1595]: Unable to negotiate with 38.102.83.114 port 34300: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 17 16:37:32 np0005622237.novalocal sshd-session[1600]: Unable to negotiate with 38.102.83.114 port 34314: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 17 16:37:33 np0005622237.novalocal sshd-session[1605]: Connection reset by 38.102.83.114 port 34328 [preauth]
Feb 17 16:37:33 np0005622237.novalocal sshd-session[1613]: Connection reset by 38.102.83.114 port 34342 [preauth]
Feb 17 16:37:33 np0005622237.novalocal sshd-session[1618]: Unable to negotiate with 38.102.83.114 port 34354: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 17 16:37:33 np0005622237.novalocal sshd-session[1626]: Unable to negotiate with 38.102.83.114 port 34366: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: memstrack is not available
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 17 16:37:33 np0005622237.novalocal dracut[1523]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: memstrack is not available
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: *** Including module: systemd ***
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: *** Including module: fips ***
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: *** Including module: systemd-initrd ***
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: *** Including module: i18n ***
Feb 17 16:37:34 np0005622237.novalocal dracut[1523]: *** Including module: drm ***
Feb 17 16:37:35 np0005622237.novalocal chronyd[814]: Selected source 206.108.0.132 (2.centos.pool.ntp.org)
Feb 17 16:37:35 np0005622237.novalocal chronyd[814]: System clock wrong by 1.113545 seconds
Feb 17 16:37:36 np0005622237.novalocal chronyd[814]: System clock was stepped by 1.113545 seconds
Feb 17 16:37:36 np0005622237.novalocal chronyd[814]: System clock TAI offset set to 37 seconds
Feb 17 16:37:36 np0005622237.novalocal dracut[1523]: *** Including module: prefixdevname ***
Feb 17 16:37:36 np0005622237.novalocal dracut[1523]: *** Including module: kernel-modules ***
Feb 17 16:37:36 np0005622237.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: kernel-modules-extra ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: qemu ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: fstab-sys ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: rootfs-block ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: terminfo ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: udev-rules ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: Skipping udev rule: 91-permissions.rules
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: virtiofs ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: dracut-systemd ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: usrmount ***
Feb 17 16:37:37 np0005622237.novalocal dracut[1523]: *** Including module: base ***
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]: *** Including module: fs-lib ***
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]: *** Including module: kdumpbase ***
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:   microcode_ctl module: mangling fw_dir
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]: *** Including module: openssl ***
Feb 17 16:37:38 np0005622237.novalocal dracut[1523]: *** Including module: shutdown ***
Feb 17 16:37:39 np0005622237.novalocal dracut[1523]: *** Including module: squash ***
Feb 17 16:37:39 np0005622237.novalocal dracut[1523]: *** Including modules done ***
Feb 17 16:37:39 np0005622237.novalocal dracut[1523]: *** Installing kernel module dependencies ***
Feb 17 16:37:39 np0005622237.novalocal dracut[1523]: *** Installing kernel module dependencies done ***
Feb 17 16:37:39 np0005622237.novalocal dracut[1523]: *** Resolving executable dependencies ***
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 35 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 35 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 33 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 33 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 31 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 28 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 34 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 34 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 32 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 30 affinity is now unmanaged
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 17 16:37:40 np0005622237.novalocal irqbalance[800]: IRQ 29 affinity is now unmanaged
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: *** Resolving executable dependencies done ***
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: *** Generating early-microcode cpio image ***
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: *** Store current command line parameters ***
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: Stored kernel commandline:
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: No dracut internal kernel commandline stored in the initramfs
Feb 17 16:37:41 np0005622237.novalocal dracut[1523]: *** Install squash loader ***
Feb 17 16:37:41 np0005622237.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 16:37:42 np0005622237.novalocal dracut[1523]: *** Squashing the files inside the initramfs ***
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: *** Squashing the files inside the initramfs done ***
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: *** Creating image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' ***
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: *** Hardlinking files ***
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Mode:           real
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Files:          50
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Linked:         0 files
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Compared:       0 xattrs
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Compared:       0 files
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Saved:          0 B
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: Duration:       0.000441 seconds
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: *** Hardlinking files done ***
Feb 17 16:37:43 np0005622237.novalocal dracut[1523]: *** Creating initramfs image file '/boot/initramfs-5.14.0-681.el9.x86_64kdump.img' done ***
Feb 17 16:37:44 np0005622237.novalocal kdumpctl[1029]: kdump: kexec: loaded kdump kernel
Feb 17 16:37:44 np0005622237.novalocal kdumpctl[1029]: kdump: Starting kdump: [OK]
Feb 17 16:37:44 np0005622237.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 17 16:37:44 np0005622237.novalocal systemd[1]: Startup finished in 1.335s (kernel) + 2.552s (initrd) + 15.646s (userspace) = 19.534s.
Feb 17 16:37:48 np0005622237.novalocal sshd-session[4797]: Accepted publickey for zuul from 38.102.83.114 port 58878 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 17 16:37:48 np0005622237.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 17 16:37:48 np0005622237.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 17 16:37:48 np0005622237.novalocal systemd-logind[806]: New session 1 of user zuul.
Feb 17 16:37:48 np0005622237.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 17 16:37:48 np0005622237.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 17 16:37:48 np0005622237.novalocal systemd[4801]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Queued start job for default target Main User Target.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Created slice User Application Slice.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Started Daily Cleanup of User's Temporary Directories.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Reached target Paths.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Reached target Timers.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Starting D-Bus User Message Bus Socket...
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Starting Create User's Volatile Files and Directories...
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Finished Create User's Volatile Files and Directories.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Listening on D-Bus User Message Bus Socket.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Reached target Sockets.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Reached target Basic System.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Reached target Main User Target.
Feb 17 16:37:49 np0005622237.novalocal systemd[4801]: Startup finished in 153ms.
Feb 17 16:37:49 np0005622237.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 17 16:37:49 np0005622237.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 17 16:37:49 np0005622237.novalocal sshd-session[4797]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:37:49 np0005622237.novalocal python3[4883]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 16:37:52 np0005622237.novalocal python3[4911]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 16:37:59 np0005622237.novalocal python3[4969]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 16:38:00 np0005622237.novalocal python3[5009]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 17 16:38:01 np0005622237.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 17 16:38:02 np0005622237.novalocal python3[5037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkYKBzeGt494Xy7vL16NUCHMOu5190Plp2T+sqgdCbBCmHUDwh+aWkOGWe4zctJ5wXZ4cVqhI/6Uf4E7kInzuyQEa/zyVVHXzNUL1EoZY1LXEKRZjUG29A+U3pDkvGLGcNyebLYBDhL07O/kOTArWwi0m1ynah3ckx5FDpafpBGf5BqpuYC+ruVOQ6qIDU/MrtV7zX+F4JdsIiQWB4WlCDXvmVZRv01D0+6h/Eu/55K04HOvKj/0LDfaFeGXkIG/xxMjFzhadX4JW15397f10EseuMgcZOcIdifr/ieK+1TATdEWR8MOSlp61jrbFsqGUmRilD45TczSBIdC1NJr/9t3o1HY3zER5o5RuzAZ2K4BWYI8iozexpEIfTm/lN4NHKJjPvM8546tmjcpjYf2vUJXszC8bWGdbRg4y1iZ5V0fvJ9LyMw0bsGVlgG7iFz6WgfgBskNRnNey3NuqDKIBjSEiJ/8m3c2yRWDm8HFIPlBpMbH12pXrf0WW/cZzGjBU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:03 np0005622237.novalocal python3[5061]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:03 np0005622237.novalocal python3[5160]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:03 np0005622237.novalocal python3[5231]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771346283.3125882-207-182699142863904/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=193124da50294d128b68d9674a8edcc6_id_rsa follow=False checksum=ae4f4eaab2a37959c4068474d29b183fc20a9a5c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:04 np0005622237.novalocal python3[5354]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:04 np0005622237.novalocal python3[5425]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771346284.2755673-240-133327645552041/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=193124da50294d128b68d9674a8edcc6_id_rsa.pub follow=False checksum=aab8f19f8a55a452763cd869d399de6a094320ab backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:06 np0005622237.novalocal python3[5473]: ansible-ping Invoked with data=pong
Feb 17 16:38:07 np0005622237.novalocal python3[5497]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 16:38:09 np0005622237.novalocal python3[5555]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 17 16:38:11 np0005622237.novalocal python3[5587]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:13 np0005622237.novalocal python3[5611]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:13 np0005622237.novalocal python3[5635]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:14 np0005622237.novalocal python3[5659]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:15 np0005622237.novalocal python3[5683]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:15 np0005622237.novalocal python3[5707]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:18 np0005622237.novalocal sudo[5731]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsecmautzisuhcgsjocvrbtizatchiug ; /usr/bin/python3'
Feb 17 16:38:18 np0005622237.novalocal sudo[5731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:24 np0005622237.novalocal python3[5733]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:24 np0005622237.novalocal sudo[5731]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:24 np0005622237.novalocal sudo[5809]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhensnmjtuttmvuyvmvjvknazjwsues ; /usr/bin/python3'
Feb 17 16:38:24 np0005622237.novalocal sudo[5809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:24 np0005622237.novalocal python3[5811]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:24 np0005622237.novalocal sudo[5809]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:25 np0005622237.novalocal sudo[5882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoarepamyxlgdurfjkjtiirqgmhkaytj ; /usr/bin/python3'
Feb 17 16:38:25 np0005622237.novalocal sudo[5882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:25 np0005622237.novalocal python3[5884]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771346304.5471194-21-237696587705544/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:25 np0005622237.novalocal sudo[5882]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:25 np0005622237.novalocal python3[5932]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:26 np0005622237.novalocal python3[5956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:26 np0005622237.novalocal python3[5980]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:26 np0005622237.novalocal python3[6004]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:26 np0005622237.novalocal python3[6028]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:27 np0005622237.novalocal python3[6052]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:27 np0005622237.novalocal python3[6076]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:27 np0005622237.novalocal python3[6100]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:27 np0005622237.novalocal python3[6124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:28 np0005622237.novalocal python3[6148]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:28 np0005622237.novalocal python3[6172]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:28 np0005622237.novalocal python3[6196]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:28 np0005622237.novalocal python3[6220]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:29 np0005622237.novalocal python3[6244]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:29 np0005622237.novalocal python3[6268]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:29 np0005622237.novalocal python3[6292]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:29 np0005622237.novalocal python3[6316]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:30 np0005622237.novalocal python3[6340]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:30 np0005622237.novalocal python3[6364]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:30 np0005622237.novalocal python3[6388]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:30 np0005622237.novalocal python3[6412]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:31 np0005622237.novalocal python3[6436]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:31 np0005622237.novalocal python3[6460]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:31 np0005622237.novalocal python3[6484]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:31 np0005622237.novalocal python3[6508]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:32 np0005622237.novalocal python3[6532]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:38:35 np0005622237.novalocal sudo[6556]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngpugtlddysqxphinkyeeugzfprkfsev ; /usr/bin/python3'
Feb 17 16:38:35 np0005622237.novalocal sudo[6556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:35 np0005622237.novalocal python3[6558]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 17 16:38:35 np0005622237.novalocal systemd[1]: Starting Time & Date Service...
Feb 17 16:38:35 np0005622237.novalocal systemd[1]: Started Time & Date Service.
Feb 17 16:38:35 np0005622237.novalocal systemd-timedated[6560]: Changed time zone to 'UTC' (UTC).
Feb 17 16:38:35 np0005622237.novalocal sudo[6556]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:37 np0005622237.novalocal sudo[6587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mothizbybarhigodfcigkospmdlffyqn ; /usr/bin/python3'
Feb 17 16:38:37 np0005622237.novalocal sudo[6587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:37 np0005622237.novalocal python3[6589]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:37 np0005622237.novalocal sudo[6587]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:37 np0005622237.novalocal python3[6665]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:37 np0005622237.novalocal python3[6736]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771346317.4891295-153-139434566762631/source _original_basename=tmpke25sm3e follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:38 np0005622237.novalocal python3[6836]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:38 np0005622237.novalocal python3[6907]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771346318.3765292-183-233544852735172/source _original_basename=tmp7rt139v7 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:39 np0005622237.novalocal sudo[7007]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aswoujtduhxmmqzwakzpfbpuprghuckl ; /usr/bin/python3'
Feb 17 16:38:39 np0005622237.novalocal sudo[7007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:39 np0005622237.novalocal python3[7009]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:39 np0005622237.novalocal sudo[7007]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:39 np0005622237.novalocal sudo[7080]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxrblnslbtilgdzrilhyjnrjjyfqlojd ; /usr/bin/python3'
Feb 17 16:38:39 np0005622237.novalocal sudo[7080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:40 np0005622237.novalocal python3[7082]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771346319.467453-231-268710203741870/source _original_basename=tmp23zqn5sd follow=False checksum=1cc2ea2b76967ada2d4710a35e138c3751da2100 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:40 np0005622237.novalocal sudo[7080]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:41 np0005622237.novalocal python3[7130]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:38:41 np0005622237.novalocal python3[7156]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:38:41 np0005622237.novalocal sudo[7234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqqytuamvilgzkfkjgjtdamkbnfmdgdu ; /usr/bin/python3'
Feb 17 16:38:41 np0005622237.novalocal sudo[7234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:41 np0005622237.novalocal python3[7236]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:38:41 np0005622237.novalocal sudo[7234]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:42 np0005622237.novalocal sudo[7307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqurbcsewwrvtwirttlivqpiklftggzx ; /usr/bin/python3'
Feb 17 16:38:42 np0005622237.novalocal sudo[7307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:42 np0005622237.novalocal python3[7309]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771346321.5385835-273-66912185627853/source _original_basename=tmp67bbhjx1 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:38:42 np0005622237.novalocal sudo[7307]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:43 np0005622237.novalocal sudo[7358]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmiogdhqhwlpyjvcjhjhdnpqlmzurmgu ; /usr/bin/python3'
Feb 17 16:38:43 np0005622237.novalocal sudo[7358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:38:43 np0005622237.novalocal python3[7360]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-f759-ff6b-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:38:43 np0005622237.novalocal sudo[7358]: pam_unix(sudo:session): session closed for user root
Feb 17 16:38:44 np0005622237.novalocal python3[7388]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-f759-ff6b-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 17 16:38:45 np0005622237.novalocal python3[7416]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:39:05 np0005622237.novalocal sudo[7440]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijdqlrxlyogsolevtddqettqlbyamgig ; /usr/bin/python3'
Feb 17 16:39:05 np0005622237.novalocal sudo[7440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:39:05 np0005622237.novalocal python3[7442]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:39:05 np0005622237.novalocal sudo[7440]: pam_unix(sudo:session): session closed for user root
Feb 17 16:39:05 np0005622237.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 17 16:39:40 np0005622237.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 17 16:39:40 np0005622237.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.7796] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 17 16:39:40 np0005622237.novalocal systemd-udevd[7446]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.7962] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8001] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8007] device (eth1): carrier: link connected
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8011] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8021] policy: auto-activating connection 'Wired connection 1' (67d0a0b6-42d4-3838-8d12-c8bde1f2493b)
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8027] device (eth1): Activation: starting connection 'Wired connection 1' (67d0a0b6-42d4-3838-8d12-c8bde1f2493b)
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8029] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8034] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8040] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 16:39:40 np0005622237.novalocal NetworkManager[868]: <info>  [1771346380.8046] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:39:41 np0005622237.novalocal python3[7472]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-e906-1717-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:39:43 np0005622237.novalocal sshd-session[7475]: Received disconnect from 45.148.10.151 port 62432:11:  [preauth]
Feb 17 16:39:43 np0005622237.novalocal sshd-session[7475]: Disconnected from authenticating user root 45.148.10.151 port 62432 [preauth]
Feb 17 16:39:48 np0005622237.novalocal sudo[7552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opsadgmguncwzrymnsxehuovwpmmayvy ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 17 16:39:48 np0005622237.novalocal sudo[7552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:39:48 np0005622237.novalocal python3[7554]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:39:48 np0005622237.novalocal sudo[7552]: pam_unix(sudo:session): session closed for user root
Feb 17 16:39:49 np0005622237.novalocal sudo[7625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unccyetjkhslkmxyxehtwywdvnnqtjga ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 17 16:39:49 np0005622237.novalocal sudo[7625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:39:49 np0005622237.novalocal python3[7627]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771346388.5097585-102-163121514090306/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=604551f4a8ac0f162888c4f51d57966890478809 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:39:49 np0005622237.novalocal sudo[7625]: pam_unix(sudo:session): session closed for user root
Feb 17 16:39:49 np0005622237.novalocal sudo[7675]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njzdlwphorlxodkvkopgcxutroascaqa ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 17 16:39:49 np0005622237.novalocal sudo[7675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:39:50 np0005622237.novalocal python3[7677]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0382] caught SIGTERM, shutting down normally.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0389] dhcp4 (eth0): canceled DHCP transaction
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0389] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0389] dhcp4 (eth0): state changed no lease
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Stopping Network Manager...
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0392] manager: NetworkManager state is now CONNECTING
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0481] dhcp4 (eth1): canceled DHCP transaction
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0481] dhcp4 (eth1): state changed no lease
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[868]: <info>  [1771346390.0521] exiting (success)
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Stopped Network Manager.
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Starting Network Manager...
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.0887] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:1da25986-a7c7-4f2a-b760-e7b6d26f1215)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.0889] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.0936] manager[0x55fada8d8000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Starting Hostname Service...
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Started Hostname Service.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1645] hostname: hostname: using hostnamed
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1647] hostname: static hostname changed from (none) to "np0005622237.novalocal"
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1652] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1657] manager[0x55fada8d8000]: rfkill: Wi-Fi hardware radio set enabled
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1657] manager[0x55fada8d8000]: rfkill: WWAN hardware radio set enabled
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1696] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1696] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1697] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1698] manager: Networking is enabled by state file
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1701] settings: Loaded settings plugin: keyfile (internal)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1708] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1744] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1756] dhcp: init: Using DHCP client 'internal'
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1761] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1768] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1775] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1786] device (lo): Activation: starting connection 'lo' (408cebb5-d164-4b54-9d84-326ea0ceda94)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1794] device (eth0): carrier: link connected
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1800] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1807] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1808] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1818] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1828] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1835] device (eth1): carrier: link connected
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1841] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1849] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (67d0a0b6-42d4-3838-8d12-c8bde1f2493b) (indicated)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1849] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1857] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1868] device (eth1): Activation: starting connection 'Wired connection 1' (67d0a0b6-42d4-3838-8d12-c8bde1f2493b)
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Started Network Manager.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1875] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1881] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1885] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1888] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1892] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1897] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1902] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1907] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1911] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1921] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1926] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1936] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1939] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1958] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1964] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1972] device (lo): Activation: successful, device activated.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1983] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.1993] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 17 16:39:50 np0005622237.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2078] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2104] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2106] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2112] manager: NetworkManager state is now CONNECTED_SITE
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2118] device (eth0): Activation: successful, device activated.
Feb 17 16:39:50 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346390.2126] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 17 16:39:50 np0005622237.novalocal sudo[7675]: pam_unix(sudo:session): session closed for user root
Feb 17 16:39:50 np0005622237.novalocal python3[7762]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-e906-1717-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:40:00 np0005622237.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 16:40:20 np0005622237.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 17 16:40:30 np0005622237.novalocal systemd[4801]: Starting Mark boot as successful...
Feb 17 16:40:30 np0005622237.novalocal systemd[4801]: Finished Mark boot as successful.
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6359] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 17 16:40:35 np0005622237.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 16:40:35 np0005622237.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6671] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6675] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6688] device (eth1): Activation: successful, device activated.
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6699] manager: startup complete
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6702] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <warn>  [1771346435.6712] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6726] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6811] dhcp4 (eth1): canceled DHCP transaction
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6811] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6812] dhcp4 (eth1): state changed no lease
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6828] policy: auto-activating connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe)
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6835] device (eth1): Activation: starting connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe)
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6836] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6839] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6851] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6861] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6907] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6910] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 16:40:35 np0005622237.novalocal NetworkManager[7685]: <info>  [1771346435.6916] device (eth1): Activation: successful, device activated.
Feb 17 16:40:45 np0005622237.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 16:40:47 np0005622237.novalocal sudo[7866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmciwvnxllbudmalrcfrlbxtkczrpxfy ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 17 16:40:47 np0005622237.novalocal sudo[7866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:40:47 np0005622237.novalocal python3[7868]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:40:47 np0005622237.novalocal sudo[7866]: pam_unix(sudo:session): session closed for user root
Feb 17 16:40:47 np0005622237.novalocal sudo[7939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbntkratjynrtkwiqdgcvkkwxlyarogh ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 17 16:40:47 np0005622237.novalocal sudo[7939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:40:47 np0005622237.novalocal python3[7941]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771346447.1523452-259-234067684631622/source _original_basename=tmp5wlqfu1l follow=False checksum=9e80c361a647f306898e8c40e8e1adf6d6032022 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:40:47 np0005622237.novalocal sudo[7939]: pam_unix(sudo:session): session closed for user root
Feb 17 16:41:47 np0005622237.novalocal sshd-session[4810]: Received disconnect from 38.102.83.114 port 58878:11: disconnected by user
Feb 17 16:41:47 np0005622237.novalocal sshd-session[4810]: Disconnected from user zuul 38.102.83.114 port 58878
Feb 17 16:41:47 np0005622237.novalocal sshd-session[4797]: pam_unix(sshd:session): session closed for user zuul
Feb 17 16:41:47 np0005622237.novalocal systemd-logind[806]: Session 1 logged out. Waiting for processes to exit.
Feb 17 16:42:05 np0005622237.novalocal sshd-session[7966]: userauth_pubkey: signature algorithm ssh-rsa not in PubkeyAcceptedAlgorithms [preauth]
Feb 17 16:42:14 np0005622237.novalocal sshd-session[7966]: Connection closed by authenticating user root 139.19.117.131 port 51514 [preauth]
Feb 17 16:43:30 np0005622237.novalocal systemd[4801]: Created slice User Background Tasks Slice.
Feb 17 16:43:30 np0005622237.novalocal systemd[4801]: Starting Cleanup of User's Temporary Files and Directories...
Feb 17 16:43:30 np0005622237.novalocal systemd[4801]: Finished Cleanup of User's Temporary Files and Directories.
Feb 17 16:47:16 np0005622237.novalocal sshd-session[7972]: Received disconnect from 45.148.10.147 port 49320:11:  [preauth]
Feb 17 16:47:16 np0005622237.novalocal sshd-session[7972]: Disconnected from authenticating user root 45.148.10.147 port 49320 [preauth]
Feb 17 16:48:49 np0005622237.novalocal sshd-session[7976]: Accepted publickey for zuul from 38.102.83.114 port 38654 ssh2: RSA SHA256:DGSxqLep0mpmszTItk9K4n83Dfv6bNg6Q/zgJfdGqpA
Feb 17 16:48:49 np0005622237.novalocal systemd-logind[806]: New session 3 of user zuul.
Feb 17 16:48:50 np0005622237.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 17 16:48:50 np0005622237.novalocal sshd-session[7976]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:48:50 np0005622237.novalocal sudo[8003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oojpfmfjygrjpfylnrikkterddloxsui ; /usr/bin/python3'
Feb 17 16:48:50 np0005622237.novalocal sudo[8003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:50 np0005622237.novalocal python3[8005]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-0b34-a641-00000000216d-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:50 np0005622237.novalocal sudo[8003]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:50 np0005622237.novalocal sudo[8032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbirvvobywnrjgeuqegwnlfkmoghxnj ; /usr/bin/python3'
Feb 17 16:48:50 np0005622237.novalocal sudo[8032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:50 np0005622237.novalocal python3[8034]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:50 np0005622237.novalocal sudo[8032]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:50 np0005622237.novalocal sudo[8058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrlwttfhqjqtioqqwnbcgjmuevwbsqxl ; /usr/bin/python3'
Feb 17 16:48:50 np0005622237.novalocal sudo[8058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:50 np0005622237.novalocal python3[8060]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:50 np0005622237.novalocal sudo[8058]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:50 np0005622237.novalocal sudo[8084]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfcnzsjrqzfliivrqhxgubagsjpvykmy ; /usr/bin/python3'
Feb 17 16:48:50 np0005622237.novalocal sudo[8084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:51 np0005622237.novalocal python3[8086]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:51 np0005622237.novalocal sudo[8084]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:51 np0005622237.novalocal sudo[8110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okltablsracxbsnkzzqoiuspvglruafb ; /usr/bin/python3'
Feb 17 16:48:51 np0005622237.novalocal sudo[8110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:51 np0005622237.novalocal python3[8112]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:51 np0005622237.novalocal sudo[8110]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:51 np0005622237.novalocal sudo[8136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awcrorzsjnhkriifdksskfbbtthsnzqb ; /usr/bin/python3'
Feb 17 16:48:51 np0005622237.novalocal sudo[8136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:51 np0005622237.novalocal python3[8138]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:51 np0005622237.novalocal sudo[8136]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:52 np0005622237.novalocal sudo[8214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jryoqlrywsahkqoieuvxwjasuvqofeeh ; /usr/bin/python3'
Feb 17 16:48:52 np0005622237.novalocal sudo[8214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:52 np0005622237.novalocal python3[8216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:48:52 np0005622237.novalocal sudo[8214]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:52 np0005622237.novalocal sudo[8287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfftyqcctdmosfecpvfzbnbwlqkervqo ; /usr/bin/python3'
Feb 17 16:48:52 np0005622237.novalocal sudo[8287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:52 np0005622237.novalocal python3[8289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771346932.0684478-503-183554555813890/source _original_basename=tmp1b2tfowt follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:48:52 np0005622237.novalocal sudo[8287]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:53 np0005622237.novalocal sudo[8337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rihudvrmykbpqkucxqrqawwvftumygji ; /usr/bin/python3'
Feb 17 16:48:53 np0005622237.novalocal sudo[8337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:53 np0005622237.novalocal python3[8339]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 16:48:53 np0005622237.novalocal systemd[1]: Reloading.
Feb 17 16:48:53 np0005622237.novalocal systemd-rc-local-generator[8356]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 16:48:53 np0005622237.novalocal sudo[8337]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:55 np0005622237.novalocal sudo[8399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zciywsvzvdammktmslyyshgfmmpfgdfz ; /usr/bin/python3'
Feb 17 16:48:55 np0005622237.novalocal sudo[8399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:55 np0005622237.novalocal python3[8401]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 17 16:48:55 np0005622237.novalocal sudo[8399]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:55 np0005622237.novalocal sudo[8425]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhmdyiehzdkamruntvxsnccvefkobcba ; /usr/bin/python3'
Feb 17 16:48:55 np0005622237.novalocal sudo[8425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:55 np0005622237.novalocal python3[8427]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:55 np0005622237.novalocal sudo[8425]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:55 np0005622237.novalocal sudo[8453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bubmvfarchphclfndbwvkjemjspsbaxe ; /usr/bin/python3'
Feb 17 16:48:55 np0005622237.novalocal sudo[8453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:55 np0005622237.novalocal python3[8455]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:55 np0005622237.novalocal sudo[8453]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:56 np0005622237.novalocal sudo[8481]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sppeiiekykigsnpwrlyoqnvjlozvuyfe ; /usr/bin/python3'
Feb 17 16:48:56 np0005622237.novalocal sudo[8481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:56 np0005622237.novalocal python3[8483]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:56 np0005622237.novalocal sudo[8481]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:56 np0005622237.novalocal sudo[8509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrjapigygmaareulhfjfzrzstbxqltkr ; /usr/bin/python3'
Feb 17 16:48:56 np0005622237.novalocal sudo[8509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:48:56 np0005622237.novalocal python3[8511]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:56 np0005622237.novalocal sudo[8509]: pam_unix(sudo:session): session closed for user root
Feb 17 16:48:56 np0005622237.novalocal python3[8538]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ec2-ffbe-0b34-a641-000000002174-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:48:57 np0005622237.novalocal python3[8568]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 17 16:48:59 np0005622237.novalocal sshd-session[7979]: Connection closed by 38.102.83.114 port 38654
Feb 17 16:48:59 np0005622237.novalocal sshd-session[7976]: pam_unix(sshd:session): session closed for user zuul
Feb 17 16:48:59 np0005622237.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 17 16:48:59 np0005622237.novalocal systemd[1]: session-3.scope: Consumed 3.699s CPU time.
Feb 17 16:48:59 np0005622237.novalocal systemd-logind[806]: Session 3 logged out. Waiting for processes to exit.
Feb 17 16:48:59 np0005622237.novalocal systemd-logind[806]: Removed session 3.
Feb 17 16:49:01 np0005622237.novalocal sshd-session[8574]: Accepted publickey for zuul from 38.102.83.114 port 56174 ssh2: RSA SHA256:DGSxqLep0mpmszTItk9K4n83Dfv6bNg6Q/zgJfdGqpA
Feb 17 16:49:01 np0005622237.novalocal systemd-logind[806]: New session 4 of user zuul.
Feb 17 16:49:01 np0005622237.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 17 16:49:01 np0005622237.novalocal sshd-session[8574]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:49:01 np0005622237.novalocal sudo[8601]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjexawjgszohsqjbwzkohdyuitlpinwv ; /usr/bin/python3'
Feb 17 16:49:01 np0005622237.novalocal sudo[8601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:49:01 np0005622237.novalocal python3[8603]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 17 16:49:08 np0005622237.novalocal setsebool[8639]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 17 16:49:08 np0005622237.novalocal setsebool[8639]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 16:49:18 np0005622237.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 16:49:29 np0005622237.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 16:49:48 np0005622237.novalocal dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 17 16:49:48 np0005622237.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 16:49:48 np0005622237.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 17 16:49:48 np0005622237.novalocal systemd[1]: Reloading.
Feb 17 16:49:48 np0005622237.novalocal systemd-rc-local-generator[9426]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 16:49:48 np0005622237.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 16:49:51 np0005622237.novalocal sudo[8601]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:01 np0005622237.novalocal python3[16422]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ec2-ffbe-b208-efae-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:50:02 np0005622237.novalocal kernel: evm: overlay not supported
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: Starting D-Bus User Message Bus...
Feb 17 16:50:02 np0005622237.novalocal dbus-broker-launch[16923]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 17 16:50:02 np0005622237.novalocal dbus-broker-launch[16923]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: Started D-Bus User Message Bus.
Feb 17 16:50:02 np0005622237.novalocal dbus-broker-lau[16923]: Ready
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: Created slice Slice /user.
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: podman-16853.scope: unit configures an IP firewall, but not running as root.
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: (This warning is only shown for the first unit using IP firewalling.)
Feb 17 16:50:02 np0005622237.novalocal systemd[4801]: Started podman-16853.scope.
Feb 17 16:50:03 np0005622237.novalocal systemd[4801]: Started podman-pause-236e386d.scope.
Feb 17 16:50:03 np0005622237.novalocal sudo[17257]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlrsearpryxnzriympqdeakckrwjylrp ; /usr/bin/python3'
Feb 17 16:50:03 np0005622237.novalocal sudo[17257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:03 np0005622237.novalocal python3[17272]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.145:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.145:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:50:03 np0005622237.novalocal python3[17272]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 17 16:50:03 np0005622237.novalocal sudo[17257]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:03 np0005622237.novalocal sshd-session[8577]: Connection closed by 38.102.83.114 port 56174
Feb 17 16:50:03 np0005622237.novalocal sshd-session[8574]: pam_unix(sshd:session): session closed for user zuul
Feb 17 16:50:03 np0005622237.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 17 16:50:03 np0005622237.novalocal systemd[1]: session-4.scope: Consumed 42.570s CPU time.
Feb 17 16:50:03 np0005622237.novalocal systemd-logind[806]: Session 4 logged out. Waiting for processes to exit.
Feb 17 16:50:03 np0005622237.novalocal systemd-logind[806]: Removed session 4.
Feb 17 16:50:23 np0005622237.novalocal sshd-session[24594]: Connection closed by 38.102.83.51 port 40412 [preauth]
Feb 17 16:50:23 np0005622237.novalocal sshd-session[24591]: Unable to negotiate with 38.102.83.51 port 40420: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 17 16:50:23 np0005622237.novalocal sshd-session[24596]: Unable to negotiate with 38.102.83.51 port 40430: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 17 16:50:23 np0005622237.novalocal sshd-session[24598]: Unable to negotiate with 38.102.83.51 port 40428: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 17 16:50:23 np0005622237.novalocal sshd-session[24600]: Connection closed by 38.102.83.51 port 40406 [preauth]
Feb 17 16:50:27 np0005622237.novalocal sshd-session[26535]: Accepted publickey for zuul from 38.102.83.114 port 44114 ssh2: RSA SHA256:DGSxqLep0mpmszTItk9K4n83Dfv6bNg6Q/zgJfdGqpA
Feb 17 16:50:27 np0005622237.novalocal systemd-logind[806]: New session 5 of user zuul.
Feb 17 16:50:27 np0005622237.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 17 16:50:27 np0005622237.novalocal sshd-session[26535]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:50:27 np0005622237.novalocal python3[26630]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMmqDzAOGpNIjyQcczmYEOOs31q6FAGnqgJw5DMFM/Zt4YLBAYltn70wjwk9uCXiD1zmHBovLENEKprg3WRcLlI= zuul@np0005622236.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:50:27 np0005622237.novalocal sudo[26769]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkxsxyxduobhpyugpndmjnsbzxaxcshp ; /usr/bin/python3'
Feb 17 16:50:27 np0005622237.novalocal sudo[26769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:27 np0005622237.novalocal python3[26777]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMmqDzAOGpNIjyQcczmYEOOs31q6FAGnqgJw5DMFM/Zt4YLBAYltn70wjwk9uCXiD1zmHBovLENEKprg3WRcLlI= zuul@np0005622236.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:50:27 np0005622237.novalocal sudo[26769]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:28 np0005622237.novalocal sudo[27056]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyaasyxkxaayfgrjfmgfacnpssgicjpu ; /usr/bin/python3'
Feb 17 16:50:28 np0005622237.novalocal sudo[27056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:28 np0005622237.novalocal python3[27065]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005622237.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 17 16:50:28 np0005622237.novalocal useradd[27124]: new group: name=cloud-admin, GID=1002
Feb 17 16:50:28 np0005622237.novalocal useradd[27124]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 17 16:50:28 np0005622237.novalocal sudo[27056]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:28 np0005622237.novalocal sudo[27248]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yftabnybpvyjzpmifhqniyjwcxijbqwn ; /usr/bin/python3'
Feb 17 16:50:28 np0005622237.novalocal sudo[27248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:29 np0005622237.novalocal python3[27256]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMmqDzAOGpNIjyQcczmYEOOs31q6FAGnqgJw5DMFM/Zt4YLBAYltn70wjwk9uCXiD1zmHBovLENEKprg3WRcLlI= zuul@np0005622236.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 17 16:50:29 np0005622237.novalocal sudo[27248]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:29 np0005622237.novalocal sudo[27490]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziergmecimvqiwyhvzuckfgqfmpygllt ; /usr/bin/python3'
Feb 17 16:50:29 np0005622237.novalocal sudo[27490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:29 np0005622237.novalocal python3[27499]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:50:29 np0005622237.novalocal sudo[27490]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:29 np0005622237.novalocal sudo[27744]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thggqboexbzqrxyrpyuolucpdoedkhjq ; /usr/bin/python3'
Feb 17 16:50:29 np0005622237.novalocal sudo[27744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:29 np0005622237.novalocal python3[27750]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771347029.1801603-135-166365874420596/source _original_basename=tmpkfd0hncz follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:50:29 np0005622237.novalocal sudo[27744]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:30 np0005622237.novalocal sudo[28005]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hilebiuwiqtoypkrroecwkgtkytdbglp ; /usr/bin/python3'
Feb 17 16:50:30 np0005622237.novalocal sudo[28005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:50:30 np0005622237.novalocal python3[28013]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 17 16:50:30 np0005622237.novalocal systemd[1]: Starting Hostname Service...
Feb 17 16:50:30 np0005622237.novalocal systemd[1]: Started Hostname Service.
Feb 17 16:50:30 np0005622237.novalocal systemd-hostnamed[28093]: Changed pretty hostname to 'compute-0'
Feb 17 16:50:30 compute-0 systemd-hostnamed[28093]: Hostname set to <compute-0> (static)
Feb 17 16:50:30 compute-0 NetworkManager[7685]: <info>  [1771347030.8200] hostname: static hostname changed from "np0005622237.novalocal" to "compute-0"
Feb 17 16:50:30 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 16:50:30 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 16:50:30 compute-0 sudo[28005]: pam_unix(sudo:session): session closed for user root
Feb 17 16:50:31 compute-0 sshd-session[26583]: Connection closed by 38.102.83.114 port 44114
Feb 17 16:50:31 compute-0 sshd-session[26535]: pam_unix(sshd:session): session closed for user zuul
Feb 17 16:50:31 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Feb 17 16:50:31 compute-0 systemd[1]: session-5.scope: Consumed 2.129s CPU time.
Feb 17 16:50:31 compute-0 systemd-logind[806]: Session 5 logged out. Waiting for processes to exit.
Feb 17 16:50:31 compute-0 systemd-logind[806]: Removed session 5.
Feb 17 16:50:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 16:50:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 16:50:39 compute-0 systemd[1]: man-db-cache-update.service: Consumed 48.602s CPU time.
Feb 17 16:50:39 compute-0 systemd[1]: run-rcb1b9347f9284c128851deb727c97139.service: Deactivated successfully.
Feb 17 16:50:40 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 16:51:00 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 17 16:52:30 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 17 16:52:30 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 17 16:52:30 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 17 16:52:30 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 17 16:54:48 compute-0 sshd-session[30498]: Received disconnect from 91.224.92.78 port 57574:11:  [preauth]
Feb 17 16:54:48 compute-0 sshd-session[30498]: Disconnected from authenticating user root 91.224.92.78 port 57574 [preauth]
Feb 17 16:55:49 compute-0 sshd-session[30500]: Accepted publickey for zuul from 38.102.83.51 port 45826 ssh2: RSA SHA256:DGSxqLep0mpmszTItk9K4n83Dfv6bNg6Q/zgJfdGqpA
Feb 17 16:55:49 compute-0 systemd-logind[806]: New session 6 of user zuul.
Feb 17 16:55:49 compute-0 systemd[1]: Started Session 6 of User zuul.
Feb 17 16:55:49 compute-0 sshd-session[30500]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 16:55:49 compute-0 python3[30576]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 16:55:50 compute-0 sudo[30690]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-accygqhtumusljrmitvsywgkpaybnbkr ; /usr/bin/python3'
Feb 17 16:55:50 compute-0 sudo[30690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:51 compute-0 python3[30692]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:51 compute-0 sudo[30690]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:51 compute-0 sudo[30763]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afzliyoepybvwcehofbjjxfnsnxihtco ; /usr/bin/python3'
Feb 17 16:55:51 compute-0 sudo[30763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:51 compute-0 python3[30765]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:51 compute-0 sudo[30763]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:51 compute-0 sudo[30789]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aojuxkucbsjbwcxdisbdcouctxkbgqml ; /usr/bin/python3'
Feb 17 16:55:51 compute-0 sudo[30789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:51 compute-0 python3[30791]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:51 compute-0 sudo[30789]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:51 compute-0 sudo[30862]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmxxxktzghhdkodldltnuxsmftpahttb ; /usr/bin/python3'
Feb 17 16:55:52 compute-0 sudo[30862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:52 compute-0 python3[30864]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:52 compute-0 sudo[30862]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:52 compute-0 sudo[30888]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uornchjyerojbhcwqkebmgqkzhdgapan ; /usr/bin/python3'
Feb 17 16:55:52 compute-0 sudo[30888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:52 compute-0 python3[30890]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:52 compute-0 sudo[30888]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:52 compute-0 sudo[30961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blglsfgxjpkskmavavcltfzupwxxsijw ; /usr/bin/python3'
Feb 17 16:55:52 compute-0 sudo[30961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:52 compute-0 python3[30963]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:52 compute-0 sudo[30961]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:52 compute-0 sudo[30987]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyughzpnfennsgumtctehpwzijyswxmc ; /usr/bin/python3'
Feb 17 16:55:52 compute-0 sudo[30987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:53 compute-0 python3[30989]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:53 compute-0 sudo[30987]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:53 compute-0 sudo[31060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enkcgwxbskigbsptjjvrmfecdvozurhi ; /usr/bin/python3'
Feb 17 16:55:53 compute-0 sudo[31060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:53 compute-0 python3[31062]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:53 compute-0 sudo[31060]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:53 compute-0 sudo[31086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsrvjywcyatulaucwtnadaguywdcqmmq ; /usr/bin/python3'
Feb 17 16:55:53 compute-0 sudo[31086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:53 compute-0 python3[31088]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:53 compute-0 sudo[31086]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:53 compute-0 sudo[31159]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmdqgsozdvsdzjugwivzjgbhjqmapfyn ; /usr/bin/python3'
Feb 17 16:55:53 compute-0 sudo[31159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:53 compute-0 python3[31161]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:53 compute-0 sudo[31159]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:54 compute-0 sudo[31185]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzmrghilpqaqiamuepqxtgnufecwugai ; /usr/bin/python3'
Feb 17 16:55:54 compute-0 sudo[31185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:54 compute-0 python3[31187]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:54 compute-0 sudo[31185]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:54 compute-0 sudo[31258]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsgtibeezvnjlmlwtqakamwvmsnfocwi ; /usr/bin/python3'
Feb 17 16:55:54 compute-0 sudo[31258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:54 compute-0 python3[31260]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:54 compute-0 sudo[31258]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:54 compute-0 sudo[31284]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqlohbfxtkylmutghddnveegpsvqete ; /usr/bin/python3'
Feb 17 16:55:54 compute-0 sudo[31284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:55 compute-0 python3[31286]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 17 16:55:55 compute-0 sudo[31284]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:55 compute-0 sudo[31357]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkgvmelkimzbmhprzwslkwjdcjtghxft ; /usr/bin/python3'
Feb 17 16:55:55 compute-0 sudo[31357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 16:55:55 compute-0 python3[31359]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771347350.8363528-36259-23178341823715/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 16:55:55 compute-0 sudo[31357]: pam_unix(sudo:session): session closed for user root
Feb 17 16:55:57 compute-0 sshd-session[31386]: Connection closed by 192.168.122.11 port 53590 [preauth]
Feb 17 16:55:57 compute-0 sshd-session[31384]: Connection closed by 192.168.122.11 port 53596 [preauth]
Feb 17 16:55:57 compute-0 sshd-session[31388]: Unable to negotiate with 192.168.122.11 port 53600: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 17 16:55:57 compute-0 sshd-session[31387]: Unable to negotiate with 192.168.122.11 port 53616: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 17 16:55:57 compute-0 sshd-session[31385]: Unable to negotiate with 192.168.122.11 port 53620: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 17 16:56:06 compute-0 python3[31417]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 16:59:34 compute-0 sshd-session[31420]: Connection closed by 115.190.249.18 port 36888
Feb 17 17:01:01 compute-0 CROND[31423]: (root) CMD (run-parts /etc/cron.hourly)
Feb 17 17:01:01 compute-0 run-parts[31426]: (/etc/cron.hourly) starting 0anacron
Feb 17 17:01:01 compute-0 anacron[31434]: Anacron started on 2026-02-17
Feb 17 17:01:01 compute-0 anacron[31434]: Will run job `cron.daily' in 45 min.
Feb 17 17:01:01 compute-0 anacron[31434]: Will run job `cron.weekly' in 65 min.
Feb 17 17:01:01 compute-0 anacron[31434]: Will run job `cron.monthly' in 85 min.
Feb 17 17:01:01 compute-0 anacron[31434]: Jobs will be executed sequentially
Feb 17 17:01:01 compute-0 run-parts[31436]: (/etc/cron.hourly) finished 0anacron
Feb 17 17:01:01 compute-0 CROND[31422]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 17 17:01:06 compute-0 sshd-session[30503]: Received disconnect from 38.102.83.51 port 45826:11: disconnected by user
Feb 17 17:01:06 compute-0 sshd-session[30503]: Disconnected from user zuul 38.102.83.51 port 45826
Feb 17 17:01:06 compute-0 sshd-session[30500]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:01:06 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 17 17:01:06 compute-0 systemd[1]: session-6.scope: Consumed 4.671s CPU time.
Feb 17 17:01:06 compute-0 systemd-logind[806]: Session 6 logged out. Waiting for processes to exit.
Feb 17 17:01:06 compute-0 systemd-logind[806]: Removed session 6.
Feb 17 17:01:39 compute-0 sshd[1016]: Timeout before authentication for connection from 115.190.249.18 to 38.102.83.53, pid = 31421
Feb 17 17:04:02 compute-0 sshd-session[31438]: Received disconnect from 45.148.10.151 port 45896:11:  [preauth]
Feb 17 17:04:02 compute-0 sshd-session[31438]: Disconnected from authenticating user root 45.148.10.151 port 45896 [preauth]
Feb 17 17:07:30 compute-0 systemd[1]: Starting dnf makecache...
Feb 17 17:07:30 compute-0 dnf[31442]: Failed determining last makecache time.
Feb 17 17:07:30 compute-0 dnf[31442]: delorean-openstack-barbican-42b4c41831408a8e323 253 kB/s |  13 kB     00:00
Feb 17 17:07:30 compute-0 dnf[31442]: delorean-python-glean-642fffe0203a8ffcc2443db52 2.7 MB/s |  65 kB     00:00
Feb 17 17:07:30 compute-0 dnf[31442]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.2 MB/s |  32 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-python-stevedore-c4acc5639fd2329372142 5.2 MB/s | 131 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-python-cloudkitty-tests-tempest-783703 1.5 MB/s |  32 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-diskimage-builder-61b717cc45660834fe9a  11 MB/s | 349 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-nova-eaa65f0b85123a4ee343246 567 kB/s |  42 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-python-designate-tests-tempest-347fdbc 271 kB/s |  18 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-glance-1fd12c29b339f30fe823e 311 kB/s |  18 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 727 kB/s |  29 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-manila-d783d10e75495b73866db 989 kB/s |  25 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-neutron-95cadbd379667c8520c8 6.2 MB/s | 154 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-octavia-5975097dd4b021385178 1.2 MB/s |  26 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-watcher-c014f81a8647287f6dcc 868 kB/s |  16 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-python-tcib-78032d201b02cee27e8e644c61 232 kB/s | 7.4 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 4.8 MB/s | 144 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-swift-dc98a8463506ac520c469a 650 kB/s |  14 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-python-tempestconf-8515371b7cceebd4282 2.9 MB/s |  53 kB     00:00
Feb 17 17:07:31 compute-0 dnf[31442]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.3 MB/s |  96 kB     00:00
Feb 17 17:07:32 compute-0 dnf[31442]: CentOS Stream 9 - BaseOS                         61 kB/s | 6.7 kB     00:00
Feb 17 17:07:32 compute-0 dnf[31442]: CentOS Stream 9 - AppStream                      54 kB/s | 6.8 kB     00:00
Feb 17 17:07:32 compute-0 dnf[31442]: CentOS Stream 9 - CRB                            62 kB/s | 6.6 kB     00:00
Feb 17 17:07:32 compute-0 dnf[31442]: CentOS Stream 9 - Extras packages                32 kB/s | 7.6 kB     00:00
Feb 17 17:07:32 compute-0 dnf[31442]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Feb 17 17:07:33 compute-0 dnf[31442]: dlrn-antelope-build-deps                         15 MB/s | 461 kB     00:00
Feb 17 17:07:33 compute-0 dnf[31442]: centos9-rabbitmq                                8.1 MB/s | 123 kB     00:00
Feb 17 17:07:33 compute-0 dnf[31442]: centos9-storage                                  23 MB/s | 415 kB     00:00
Feb 17 17:07:33 compute-0 dnf[31442]: centos9-opstools                                410 kB/s |  51 kB     00:00
Feb 17 17:07:33 compute-0 dnf[31442]: NFV SIG OpenvSwitch                             3.3 MB/s | 465 kB     00:00
Feb 17 17:07:34 compute-0 dnf[31442]: repo-setup-centos-appstream                     100 MB/s |  27 MB     00:00
Feb 17 17:07:40 compute-0 dnf[31442]: repo-setup-centos-baseos                         75 MB/s | 8.9 MB     00:00
Feb 17 17:07:40 compute-0 sshd-session[31530]: Accepted publickey for zuul from 192.168.122.30 port 44586 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:07:40 compute-0 systemd-logind[806]: New session 7 of user zuul.
Feb 17 17:07:40 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 17 17:07:40 compute-0 sshd-session[31530]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:07:41 compute-0 python3.9[31683]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:07:41 compute-0 dnf[31442]: repo-setup-centos-highavailability               28 MB/s | 744 kB     00:00
Feb 17 17:07:42 compute-0 dnf[31442]: repo-setup-centos-powertools                     92 MB/s | 7.8 MB     00:00
Feb 17 17:07:42 compute-0 sudo[31870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exffzaojouamyzuwgulnymmhdiqgdoqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348062.0972328-27-196247163013397/AnsiballZ_command.py'
Feb 17 17:07:42 compute-0 sudo[31870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:07:42 compute-0 python3.9[31873]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:07:44 compute-0 dnf[31442]: Extra Packages for Enterprise Linux 9 - x86_64   17 MB/s |  20 MB     00:01
Feb 17 17:07:55 compute-0 sudo[31870]: pam_unix(sudo:session): session closed for user root
Feb 17 17:07:55 compute-0 sshd-session[31533]: Connection closed by 192.168.122.30 port 44586
Feb 17 17:07:55 compute-0 sshd-session[31530]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:07:55 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 17 17:07:55 compute-0 systemd[1]: session-7.scope: Consumed 7.454s CPU time.
Feb 17 17:07:55 compute-0 systemd-logind[806]: Session 7 logged out. Waiting for processes to exit.
Feb 17 17:07:55 compute-0 systemd-logind[806]: Removed session 7.
Feb 17 17:07:57 compute-0 dnf[31442]: Metadata cache created.
Feb 17 17:07:57 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 17 17:07:57 compute-0 systemd[1]: Finished dnf makecache.
Feb 17 17:07:57 compute-0 systemd[1]: dnf-makecache.service: Consumed 24.069s CPU time.
Feb 17 17:08:00 compute-0 sshd-session[31936]: Accepted publickey for zuul from 192.168.122.30 port 59406 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:08:00 compute-0 systemd-logind[806]: New session 8 of user zuul.
Feb 17 17:08:00 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 17 17:08:00 compute-0 sshd-session[31936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:08:01 compute-0 python3.9[32089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:08:02 compute-0 sshd-session[31939]: Connection closed by 192.168.122.30 port 59406
Feb 17 17:08:02 compute-0 sshd-session[31936]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:08:02 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 17 17:08:02 compute-0 systemd-logind[806]: Session 8 logged out. Waiting for processes to exit.
Feb 17 17:08:02 compute-0 systemd-logind[806]: Removed session 8.
Feb 17 17:08:16 compute-0 sshd-session[32117]: Accepted publickey for zuul from 192.168.122.30 port 43312 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:08:16 compute-0 systemd-logind[806]: New session 9 of user zuul.
Feb 17 17:08:16 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 17 17:08:16 compute-0 sshd-session[32117]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:08:17 compute-0 python3.9[32270]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 17 17:08:18 compute-0 python3.9[32444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:08:19 compute-0 sudo[32594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulvtjsyhlgzylatrjttwdzlbxoixwvcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348098.9823823-40-103582172027209/AnsiballZ_command.py'
Feb 17 17:08:19 compute-0 sudo[32594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:19 compute-0 python3.9[32597]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:08:19 compute-0 sudo[32594]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:20 compute-0 sudo[32748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phexisdrerfbgkicdfxvfyeumnyobdin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348099.8185987-52-146774203852418/AnsiballZ_stat.py'
Feb 17 17:08:20 compute-0 sudo[32748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:20 compute-0 python3.9[32751]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:08:20 compute-0 sudo[32748]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:21 compute-0 sudo[32901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpcucfmamqhbzfxkqbmgsspkpjglcsrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348100.7432005-60-128058498572113/AnsiballZ_file.py'
Feb 17 17:08:21 compute-0 sudo[32901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:21 compute-0 python3.9[32904]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:08:21 compute-0 sudo[32901]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:21 compute-0 sudo[33054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqvevnepjxucwzkzyuchnwqyrlhknlzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348101.5443053-68-30812904561541/AnsiballZ_stat.py'
Feb 17 17:08:21 compute-0 sudo[33054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:21 compute-0 python3.9[33057]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:08:21 compute-0 sudo[33054]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:22 compute-0 sudo[33178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbtubwfwcfxbijmotvhynyzdujohdrsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348101.5443053-68-30812904561541/AnsiballZ_copy.py'
Feb 17 17:08:22 compute-0 sudo[33178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:22 compute-0 python3.9[33181]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348101.5443053-68-30812904561541/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:08:22 compute-0 sudo[33178]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:23 compute-0 sudo[33332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beugbdpvleqwbybyebzsvphvjqerzfzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348102.7825627-83-218638495266541/AnsiballZ_setup.py'
Feb 17 17:08:23 compute-0 sudo[33332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:23 compute-0 python3.9[33335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:08:23 compute-0 sudo[33332]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:23 compute-0 sudo[33489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osjgmadhigpodtgocnstnrdjwyewgpxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348103.7149804-91-168002976378369/AnsiballZ_file.py'
Feb 17 17:08:23 compute-0 sudo[33489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:24 compute-0 python3.9[33492]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:08:24 compute-0 sudo[33489]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:24 compute-0 sudo[33642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpeynyhkpcrktbdjwycyedyqgraatds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348104.3154342-100-51570772670241/AnsiballZ_file.py'
Feb 17 17:08:24 compute-0 sudo[33642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:24 compute-0 python3.9[33645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:08:24 compute-0 sudo[33642]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:25 compute-0 python3.9[33795]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:08:28 compute-0 python3.9[34049]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:08:29 compute-0 python3.9[34199]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:08:30 compute-0 python3.9[34353]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:08:31 compute-0 sudo[34509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytughtdxcoqvgusosyvrufizcmaprexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348110.7458608-148-140273498222033/AnsiballZ_setup.py'
Feb 17 17:08:31 compute-0 sudo[34509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:31 compute-0 python3.9[34512]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:08:31 compute-0 sudo[34509]: pam_unix(sudo:session): session closed for user root
Feb 17 17:08:31 compute-0 sudo[34594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwlgmykuxhfehxxzalxgeikkpwjtppwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348110.7458608-148-140273498222033/AnsiballZ_dnf.py'
Feb 17 17:08:31 compute-0 sudo[34594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:08:32 compute-0 python3.9[34597]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:08:57 compute-0 sshd-session[34742]: Connection closed by 209.38.233.161 port 58088
Feb 17 17:09:07 compute-0 systemd[1]: Reloading.
Feb 17 17:09:07 compute-0 systemd-rc-local-generator[34792]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:09:07 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 17 17:09:07 compute-0 systemd[1]: Reloading.
Feb 17 17:09:07 compute-0 systemd-rc-local-generator[34843]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:09:07 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 17 17:09:07 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 17 17:09:07 compute-0 systemd[1]: Reloading.
Feb 17 17:09:07 compute-0 systemd-rc-local-generator[34889]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:09:08 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 17 17:09:08 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:09:08 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:09:08 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:10:05 compute-0 kernel: SELinux:  Converting 2728 SID table entries...
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:10:05 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:10:05 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 17 17:10:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:10:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:10:05 compute-0 systemd[1]: Reloading.
Feb 17 17:10:05 compute-0 systemd-rc-local-generator[35229]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:10:05 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:10:06 compute-0 sudo[34594]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:07 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:10:07 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:10:07 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.047s CPU time.
Feb 17 17:10:07 compute-0 systemd[1]: run-r430d38b577b14b15abf32b104e33e1ed.service: Deactivated successfully.
Feb 17 17:10:08 compute-0 sudo[36161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxymjqktuynrrqrkmvnkcfgovqwobcak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348206.7884655-160-39909690422177/AnsiballZ_command.py'
Feb 17 17:10:08 compute-0 sudo[36161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:09 compute-0 python3.9[36164]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:10 compute-0 sudo[36161]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:10 compute-0 sudo[36443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwinvcgppsizaorvbfpsuulduziwscxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348210.2237408-168-232787597119477/AnsiballZ_selinux.py'
Feb 17 17:10:10 compute-0 sudo[36443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:11 compute-0 python3.9[36446]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 17 17:10:11 compute-0 sudo[36443]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:11 compute-0 sudo[36596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrokcfyghachvcolxttnwvbxsbfvncix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348211.3889718-179-217484988985653/AnsiballZ_command.py'
Feb 17 17:10:11 compute-0 sudo[36596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:11 compute-0 python3.9[36599]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 17 17:10:12 compute-0 sudo[36596]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:12 compute-0 sudo[36750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-forioqahkxjhjroeonzhvokwjmsirtjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348212.4615371-187-58336145702321/AnsiballZ_file.py'
Feb 17 17:10:12 compute-0 sudo[36750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:13 compute-0 python3.9[36753]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:10:13 compute-0 sudo[36750]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:14 compute-0 sudo[36903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpootylshlnaoacoseizzrciukmyvvic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348213.8180504-195-174209264958647/AnsiballZ_mount.py'
Feb 17 17:10:14 compute-0 sudo[36903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:14 compute-0 python3.9[36906]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 17 17:10:14 compute-0 sudo[36903]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:15 compute-0 sudo[37056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oocyhaumakgsdpwpnwxnkgaijmmqnslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348215.4474807-223-20447573887079/AnsiballZ_file.py'
Feb 17 17:10:15 compute-0 sudo[37056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:15 compute-0 python3.9[37059]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:10:15 compute-0 sudo[37056]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:16 compute-0 sudo[37209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptupnrmnziimmanvwsxhsoykuhhxxvaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348216.0481062-231-214098886843087/AnsiballZ_stat.py'
Feb 17 17:10:16 compute-0 sudo[37209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:16 compute-0 python3.9[37212]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:10:16 compute-0 sudo[37209]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:16 compute-0 sudo[37333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyrzgdbueqfjygszunzluunrjzconwqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348216.0481062-231-214098886843087/AnsiballZ_copy.py'
Feb 17 17:10:16 compute-0 sudo[37333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:17 compute-0 python3.9[37336]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348216.0481062-231-214098886843087/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:10:17 compute-0 sudo[37333]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:17 compute-0 sudo[37487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opjsevbnwzmhjzahkffpgxbmxcbvykyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348217.4708803-255-46447819993954/AnsiballZ_stat.py'
Feb 17 17:10:17 compute-0 sudo[37487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:17 compute-0 python3.9[37490]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:10:17 compute-0 sudo[37487]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:18 compute-0 sudo[37640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptrmnszcwncfcgzwuhveuialfncxedsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348218.035869-263-180433048961440/AnsiballZ_command.py'
Feb 17 17:10:18 compute-0 sudo[37640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:20 compute-0 python3.9[37643]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:20 compute-0 sudo[37640]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:21 compute-0 sudo[37794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fegxyiozeagsmzkvybgtfgylragglruh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348220.9902256-271-36026767348446/AnsiballZ_file.py'
Feb 17 17:10:21 compute-0 sudo[37794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:21 compute-0 python3.9[37797]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:10:21 compute-0 sudo[37794]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:22 compute-0 sudo[37947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuwqcnnbflbvtccbxnjdjdsnhthnhsrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348221.6552432-282-126416142142437/AnsiballZ_getent.py'
Feb 17 17:10:22 compute-0 sudo[37947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:22 compute-0 python3.9[37950]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 17 17:10:22 compute-0 sudo[37947]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:22 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:10:22 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:10:22 compute-0 sudo[38102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwtmcxgpuybtygknzmgjeqvgcqrodaha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348222.412301-290-157228938286188/AnsiballZ_group.py'
Feb 17 17:10:22 compute-0 sudo[38102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:23 compute-0 python3.9[38105]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:10:23 compute-0 groupadd[38106]: group added to /etc/group: name=qemu, GID=107
Feb 17 17:10:23 compute-0 groupadd[38106]: group added to /etc/gshadow: name=qemu
Feb 17 17:10:23 compute-0 groupadd[38106]: new group: name=qemu, GID=107
Feb 17 17:10:23 compute-0 sudo[38102]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:23 compute-0 sudo[38261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspxvqxleeczeewbbqqasquywwyteakd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348223.2595031-298-43670557915450/AnsiballZ_user.py'
Feb 17 17:10:23 compute-0 sudo[38261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:23 compute-0 python3.9[38264]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 17 17:10:23 compute-0 useradd[38266]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/1
Feb 17 17:10:23 compute-0 sudo[38261]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:24 compute-0 sudo[38422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zngyszqocegietlbwgyxnkgkzlforezs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348224.1270235-306-241183435622439/AnsiballZ_getent.py'
Feb 17 17:10:24 compute-0 sudo[38422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:24 compute-0 python3.9[38425]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 17 17:10:24 compute-0 sudo[38422]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:24 compute-0 sudo[38576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgqvxkidquzrvtytzafheqzftlhebtau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348224.6734877-314-98883711986576/AnsiballZ_group.py'
Feb 17 17:10:24 compute-0 sudo[38576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:25 compute-0 python3.9[38579]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:10:25 compute-0 groupadd[38580]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 17 17:10:25 compute-0 groupadd[38580]: group added to /etc/gshadow: name=hugetlbfs
Feb 17 17:10:25 compute-0 groupadd[38580]: new group: name=hugetlbfs, GID=42477
Feb 17 17:10:25 compute-0 sudo[38576]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:25 compute-0 sudo[38735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijrrfszanprrcwewbgkexvzdbtaxijn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348225.3787394-323-218739706342015/AnsiballZ_file.py'
Feb 17 17:10:25 compute-0 sudo[38735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:25 compute-0 python3.9[38738]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 17 17:10:25 compute-0 sudo[38735]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:26 compute-0 sudo[38888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqpeeygiaxzuxvvyyxgsrhexxaqghdhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348226.1199727-334-193743153449467/AnsiballZ_dnf.py'
Feb 17 17:10:26 compute-0 sudo[38888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:26 compute-0 python3.9[38891]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:10:28 compute-0 sudo[38888]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:28 compute-0 sudo[39042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pacfyzqlgsmsnclrwtltqfjgecbksxmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348228.1999924-342-281108897979326/AnsiballZ_file.py'
Feb 17 17:10:28 compute-0 sudo[39042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:28 compute-0 python3.9[39045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:10:28 compute-0 sudo[39042]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:29 compute-0 sudo[39195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obczagrcvhmjfolnyzgqhumkyygyleae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348228.8587444-350-67716426834142/AnsiballZ_stat.py'
Feb 17 17:10:29 compute-0 sudo[39195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:29 compute-0 python3.9[39198]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:10:29 compute-0 sudo[39195]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:29 compute-0 sudo[39319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imqubbxmrmhsbiddipcyflouehavkfgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348228.8587444-350-67716426834142/AnsiballZ_copy.py'
Feb 17 17:10:29 compute-0 sudo[39319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:29 compute-0 python3.9[39322]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348228.8587444-350-67716426834142/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:10:29 compute-0 sudo[39319]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:30 compute-0 sudo[39472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcaonmtppddstlpneflyrdhlirhlrpvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348230.0679715-365-252195326536766/AnsiballZ_systemd.py'
Feb 17 17:10:30 compute-0 sudo[39472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:30 compute-0 python3.9[39475]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:10:30 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 17 17:10:30 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 17 17:10:30 compute-0 kernel: Bridge firewalling registered
Feb 17 17:10:30 compute-0 systemd-modules-load[39479]: Inserted module 'br_netfilter'
Feb 17 17:10:30 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 17 17:10:31 compute-0 sudo[39472]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:31 compute-0 sudo[39633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bitcjgxeiixtoyfgvgbkrvnupqtrxsti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348231.1541843-373-146348948225714/AnsiballZ_stat.py'
Feb 17 17:10:31 compute-0 sudo[39633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:31 compute-0 python3.9[39636]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:10:31 compute-0 sudo[39633]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:31 compute-0 sudo[39757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfajrbzckmwwhgnhewipaahomddamqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348231.1541843-373-146348948225714/AnsiballZ_copy.py'
Feb 17 17:10:31 compute-0 sudo[39757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:32 compute-0 python3.9[39760]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348231.1541843-373-146348948225714/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:10:32 compute-0 sudo[39757]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:32 compute-0 sudo[39910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypyxznyqaidxhcymwgkvbgflaqbbrhrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348232.4232993-391-58719326126285/AnsiballZ_dnf.py'
Feb 17 17:10:32 compute-0 sudo[39910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:32 compute-0 python3.9[39913]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:10:35 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:10:35 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:10:35 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:10:35 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:10:36 compute-0 systemd[1]: Reloading.
Feb 17 17:10:36 compute-0 systemd-rc-local-generator[39978]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:10:36 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:10:36 compute-0 sudo[39910]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:37 compute-0 python3.9[41616]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:10:38 compute-0 python3.9[42682]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 17 17:10:38 compute-0 python3.9[43649]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:10:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:10:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:10:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.466s CPU time.
Feb 17 17:10:38 compute-0 systemd[1]: run-r508e0ac389814ca6891318af22065d6c.service: Deactivated successfully.
Feb 17 17:10:39 compute-0 sudo[44162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oengfgvabrztdglvjqcjltjayzkonusu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348238.862407-430-90701131603227/AnsiballZ_command.py'
Feb 17 17:10:39 compute-0 sudo[44162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:39 compute-0 python3.9[44165]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:39 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 17 17:10:39 compute-0 systemd[1]: Starting Authorization Manager...
Feb 17 17:10:39 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 17 17:10:39 compute-0 polkitd[44382]: Started polkitd version 0.117
Feb 17 17:10:39 compute-0 polkitd[44382]: Loading rules from directory /etc/polkit-1/rules.d
Feb 17 17:10:39 compute-0 polkitd[44382]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 17 17:10:39 compute-0 polkitd[44382]: Finished loading, compiling and executing 2 rules
Feb 17 17:10:39 compute-0 polkitd[44382]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 17 17:10:39 compute-0 systemd[1]: Started Authorization Manager.
Feb 17 17:10:39 compute-0 sudo[44162]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:40 compute-0 sudo[44550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usohtpzoklqtgrmtlrcbwsiugicdxhid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348240.1596758-439-11477062707816/AnsiballZ_systemd.py'
Feb 17 17:10:40 compute-0 sudo[44550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:40 compute-0 python3.9[44553]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:10:40 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 17 17:10:40 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 17 17:10:40 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 17 17:10:40 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 17 17:10:40 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 17 17:10:41 compute-0 sudo[44550]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:41 compute-0 python3.9[44715]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 17 17:10:43 compute-0 sudo[44865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzsaajppoesfduladfrqshhriduqiggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348242.9518921-496-259081045861406/AnsiballZ_systemd.py'
Feb 17 17:10:43 compute-0 sudo[44865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:43 compute-0 python3.9[44868]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:10:43 compute-0 systemd[1]: Reloading.
Feb 17 17:10:43 compute-0 systemd-rc-local-generator[44891]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:10:43 compute-0 sudo[44865]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:44 compute-0 sudo[45062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlxoralpewrbucwtymnyrnntdhhgyvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348243.878809-496-253848528546363/AnsiballZ_systemd.py'
Feb 17 17:10:44 compute-0 sudo[45062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:44 compute-0 python3.9[45065]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:10:44 compute-0 systemd[1]: Reloading.
Feb 17 17:10:44 compute-0 systemd-rc-local-generator[45096]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:10:44 compute-0 sudo[45062]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:45 compute-0 sudo[45259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yymupjqowxlfmbntblkgogzxqyffjmrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348244.88707-512-185932097107072/AnsiballZ_command.py'
Feb 17 17:10:45 compute-0 sudo[45259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:45 compute-0 python3.9[45262]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:45 compute-0 sudo[45259]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:45 compute-0 sudo[45413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oefktmnujlqqlhbydgchyehlrkwajjxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348245.4976528-520-259032933501562/AnsiballZ_command.py'
Feb 17 17:10:45 compute-0 sudo[45413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:45 compute-0 python3.9[45416]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:45 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 17 17:10:45 compute-0 sudo[45413]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:46 compute-0 sudo[45567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edvjelfnvoaloqztkjtbaeduomiugivw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348246.0829284-528-169722095360328/AnsiballZ_command.py'
Feb 17 17:10:46 compute-0 sudo[45567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:46 compute-0 python3.9[45570]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:47 compute-0 sudo[45567]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:48 compute-0 sudo[45730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsivqetbnetqahjdyqlotkaboebsyzrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348248.058123-536-59080011189464/AnsiballZ_command.py'
Feb 17 17:10:48 compute-0 sudo[45730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:48 compute-0 python3.9[45733]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:48 compute-0 sudo[45730]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:48 compute-0 sudo[45884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhdhpmksrvszkcsxboqwiiwhaqohoxfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348248.6050494-544-22172004775937/AnsiballZ_systemd.py'
Feb 17 17:10:48 compute-0 sudo[45884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:49 compute-0 python3.9[45887]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:10:49 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 17 17:10:49 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 17 17:10:49 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 17 17:10:49 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 17 17:10:49 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 17 17:10:49 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 17 17:10:49 compute-0 sudo[45884]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:49 compute-0 sshd-session[32120]: Connection closed by 192.168.122.30 port 43312
Feb 17 17:10:49 compute-0 sshd-session[32117]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:10:49 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 17 17:10:49 compute-0 systemd[1]: session-9.scope: Consumed 1min 55.898s CPU time.
Feb 17 17:10:49 compute-0 systemd-logind[806]: Session 9 logged out. Waiting for processes to exit.
Feb 17 17:10:49 compute-0 systemd-logind[806]: Removed session 9.
Feb 17 17:10:54 compute-0 sshd-session[45918]: Accepted publickey for zuul from 192.168.122.30 port 55280 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:10:54 compute-0 systemd-logind[806]: New session 10 of user zuul.
Feb 17 17:10:54 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 17 17:10:54 compute-0 sshd-session[45918]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:10:55 compute-0 python3.9[46071]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:10:56 compute-0 python3.9[46225]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:10:57 compute-0 sudo[46379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtaieixnfkzfmmoxgflklpqybzbilvxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348257.2168033-45-99853781819030/AnsiballZ_command.py'
Feb 17 17:10:57 compute-0 sudo[46379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:57 compute-0 python3.9[46382]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:10:57 compute-0 sudo[46379]: pam_unix(sudo:session): session closed for user root
Feb 17 17:10:58 compute-0 python3.9[46533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:10:59 compute-0 sudo[46687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwqtmucsvdeiavpverhtoclwxjxnxryz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348259.021015-65-66810090049283/AnsiballZ_setup.py'
Feb 17 17:10:59 compute-0 sudo[46687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:10:59 compute-0 python3.9[46690]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:10:59 compute-0 sudo[46687]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:00 compute-0 sudo[46772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbqwovinxmqdxoromhejyghfeyzefbpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348259.021015-65-66810090049283/AnsiballZ_dnf.py'
Feb 17 17:11:00 compute-0 sudo[46772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:00 compute-0 python3.9[46775]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:11:01 compute-0 sudo[46772]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:01 compute-0 sudo[46926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwstghehsnhrfqfarolfsbcwpdlvvpvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348261.675589-77-76395304308256/AnsiballZ_setup.py'
Feb 17 17:11:01 compute-0 sudo[46926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:02 compute-0 python3.9[46929]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:11:02 compute-0 sudo[46926]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:03 compute-0 sudo[47098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkpuedwgpcbxaccliurajxkxchwwslxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348262.4722483-88-32193697901266/AnsiballZ_file.py'
Feb 17 17:11:03 compute-0 sudo[47098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:03 compute-0 python3.9[47101]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:11:03 compute-0 sudo[47098]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:03 compute-0 sudo[47251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uptorsfrntgvdvvbnjvvagwwcqfrvgkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348263.4670343-96-25039730566094/AnsiballZ_command.py'
Feb 17 17:11:03 compute-0 sudo[47251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:03 compute-0 python3.9[47254]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:11:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1665285701-merged.mount: Deactivated successfully.
Feb 17 17:11:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1076895588-merged.mount: Deactivated successfully.
Feb 17 17:11:03 compute-0 podman[47255]: 2026-02-17 17:11:03.981744513 +0000 UTC m=+0.066390619 system refresh
Feb 17 17:11:04 compute-0 sudo[47251]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:04 compute-0 sudo[47415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubdohfsyuihqiypzstnaciwgbogoxbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348264.1683438-104-51306350622382/AnsiballZ_stat.py'
Feb 17 17:11:04 compute-0 sudo[47415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:04 compute-0 python3.9[47418]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:11:04 compute-0 sudo[47415]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:05 compute-0 sudo[47541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynndkxoxzfkqrrwtcqeyckiwyegajxom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348264.1683438-104-51306350622382/AnsiballZ_copy.py'
Feb 17 17:11:05 compute-0 sudo[47541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:05 compute-0 python3.9[47544]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348264.1683438-104-51306350622382/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a33aa23afa41eb77ff026f8283ed844ed3bc8e82 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:11:05 compute-0 sudo[47541]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:05 compute-0 sudo[47694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguoyobnocllcxnjbfmfkrvldwfaqnko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348265.5842164-119-250858034929807/AnsiballZ_stat.py'
Feb 17 17:11:05 compute-0 sudo[47694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:05 compute-0 sshd-session[47419]: Connection closed by authenticating user root 209.38.233.161 port 55748 [preauth]
Feb 17 17:11:06 compute-0 python3.9[47697]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:11:06 compute-0 sudo[47694]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:06 compute-0 sudo[47818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftbmsfjrpcfdnlfpqsoihfwcufvpnrsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348265.5842164-119-250858034929807/AnsiballZ_copy.py'
Feb 17 17:11:06 compute-0 sudo[47818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:06 compute-0 python3.9[47821]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348265.5842164-119-250858034929807/.source.conf follow=False _original_basename=registries.conf.j2 checksum=d987b949eaca6ee61c2461c1b8dc7f701ea74149 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:11:06 compute-0 sudo[47818]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:07 compute-0 sudo[47971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klzziiwhmsvwsvziklydniliwguprvmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348266.705909-135-69395840524714/AnsiballZ_ini_file.py'
Feb 17 17:11:07 compute-0 sudo[47971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:07 compute-0 python3.9[47974]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:11:07 compute-0 sudo[47971]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:07 compute-0 sudo[48124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzgyggircntgrcovcuomzrpytlxgfikm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348267.3957827-135-231889303804795/AnsiballZ_ini_file.py'
Feb 17 17:11:07 compute-0 sudo[48124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:07 compute-0 python3.9[48127]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:11:07 compute-0 sudo[48124]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:08 compute-0 sudo[48277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vosqlugegnnunrabjxupbvnbgtszjwqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348267.9605014-135-51336245271003/AnsiballZ_ini_file.py'
Feb 17 17:11:08 compute-0 sudo[48277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:08 compute-0 python3.9[48280]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:11:08 compute-0 sudo[48277]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:08 compute-0 sudo[48430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zebqtafzwjfzhyvhfojweyjeuxbtwfsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348268.5346937-135-200249141342978/AnsiballZ_ini_file.py'
Feb 17 17:11:08 compute-0 sudo[48430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:08 compute-0 python3.9[48433]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:11:08 compute-0 sudo[48430]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:09 compute-0 python3.9[48583]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:11:10 compute-0 sudo[48735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qomvgcvtinscmiohhzwfvwfngpxhwyzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348269.8598783-175-192782699182057/AnsiballZ_dnf.py'
Feb 17 17:11:10 compute-0 sudo[48735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:10 compute-0 python3.9[48738]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:11 compute-0 sudo[48735]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:12 compute-0 sudo[48889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ersndntttwysizfdfnzdsjkjtqfjnoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348271.772826-183-78075112445771/AnsiballZ_dnf.py'
Feb 17 17:11:12 compute-0 sudo[48889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:12 compute-0 python3.9[48892]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:13 compute-0 sudo[48889]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:14 compute-0 sudo[49051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqgckyhbacldwjayxsjfuhskgqhgazlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348274.0890298-193-207671603359470/AnsiballZ_dnf.py'
Feb 17 17:11:14 compute-0 sudo[49051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:14 compute-0 python3.9[49054]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:15 compute-0 sudo[49051]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:16 compute-0 sudo[49205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdehhymtgfopsecjvtsehjkuwnbgwxjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348275.9752412-202-38839479242331/AnsiballZ_dnf.py'
Feb 17 17:11:16 compute-0 sudo[49205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:16 compute-0 python3.9[49208]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:17 compute-0 sudo[49205]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:18 compute-0 sudo[49359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlintpvnkobvyijtmdxkshiqyipfije ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348278.0330286-213-12074204899837/AnsiballZ_dnf.py'
Feb 17 17:11:18 compute-0 sudo[49359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:18 compute-0 python3.9[49362]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:19 compute-0 sudo[49359]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:20 compute-0 sudo[49516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqmprtgwajgejyzvjtsmykbgwfttxoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348280.1332042-221-272885775535699/AnsiballZ_dnf.py'
Feb 17 17:11:20 compute-0 sudo[49516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:20 compute-0 python3.9[49519]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:23 compute-0 sudo[49516]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:23 compute-0 sudo[49686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knpibulzhlpubiugupygqifsnabyqgot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348283.2578988-230-265345401930287/AnsiballZ_dnf.py'
Feb 17 17:11:23 compute-0 sudo[49686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:23 compute-0 python3.9[49689]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:24 compute-0 sudo[49686]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:25 compute-0 sudo[49840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxhplydajsrwmfkbakazucucwvaifosl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348285.059037-239-137192554991602/AnsiballZ_dnf.py'
Feb 17 17:11:25 compute-0 sudo[49840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:25 compute-0 python3.9[49843]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:39 compute-0 sudo[49840]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:39 compute-0 sudo[50177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djkegbwkdbsugmhutworcxvldxvvqepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348299.4190247-248-2935636672146/AnsiballZ_dnf.py'
Feb 17 17:11:39 compute-0 sudo[50177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:39 compute-0 python3.9[50180]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:41 compute-0 sudo[50177]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:41 compute-0 sudo[50334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzynzalmorxeozmgjtrenwoflcthikdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348301.4559135-258-61317202620602/AnsiballZ_dnf.py'
Feb 17 17:11:41 compute-0 sudo[50334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:41 compute-0 python3.9[50337]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:11:43 compute-0 sudo[50334]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:44 compute-0 sudo[50492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwzkrtkfarsygniftdpmvjvlnqguqdpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348303.6075966-269-77969967387419/AnsiballZ_file.py'
Feb 17 17:11:44 compute-0 sudo[50492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:44 compute-0 python3.9[50495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:11:44 compute-0 sudo[50492]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:45 compute-0 sudo[50668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnhzntqigozoeqaevdlrjccgczsdzne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348305.0191467-277-237004819474293/AnsiballZ_stat.py'
Feb 17 17:11:45 compute-0 sudo[50668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:45 compute-0 python3.9[50671]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:11:45 compute-0 sudo[50668]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:45 compute-0 sudo[50792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuyjfrqyednmhrkiwspxoutnpzbpmdgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348305.0191467-277-237004819474293/AnsiballZ_copy.py'
Feb 17 17:11:45 compute-0 sudo[50792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:46 compute-0 python3.9[50795]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771348305.0191467-277-237004819474293/.source.json _original_basename=.i6rvibym follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:11:46 compute-0 sudo[50792]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:46 compute-0 sudo[50945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqstaeauedjublzoekpcerovixgvspbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348306.37737-295-257521022363440/AnsiballZ_podman_image.py'
Feb 17 17:11:46 compute-0 sudo[50945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:47 compute-0 python3.9[50948]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 17 17:11:47 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2611658229-lower\x2dmapped.mount: Deactivated successfully.
Feb 17 17:11:51 compute-0 podman[50960]: 2026-02-17 17:11:51.766337192 +0000 UTC m=+4.658664761 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 17 17:11:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:51 compute-0 sudo[50945]: pam_unix(sudo:session): session closed for user root
Feb 17 17:11:52 compute-0 sudo[51257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlhfvzlttszvfxnysfdtfxzpczrfhrsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348312.197067-306-31411811801648/AnsiballZ_podman_image.py'
Feb 17 17:11:52 compute-0 sudo[51257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:11:52 compute-0 python3.9[51260]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 17 17:11:59 compute-0 podman[51272]: 2026-02-17 17:11:59.5544016 +0000 UTC m=+6.875313000 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:11:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:11:59 compute-0 sudo[51257]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:00 compute-0 sudo[51567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ringigouyuqlpxaugckvcmdbcpqhxvyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348319.9809787-316-23812950080501/AnsiballZ_podman_image.py'
Feb 17 17:12:00 compute-0 sudo[51567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:00 compute-0 python3.9[51570]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 17 17:12:10 compute-0 podman[51581]: 2026-02-17 17:12:10.083633146 +0000 UTC m=+9.658768401 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 17 17:12:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:10 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:10 compute-0 sudo[51567]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:10 compute-0 sudo[51839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efwlesvpgedgaufxacxnmclozexbnyhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348330.5423286-327-245970233213911/AnsiballZ_podman_image.py'
Feb 17 17:12:10 compute-0 sudo[51839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:10 compute-0 python3.9[51842]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 17 17:12:13 compute-0 podman[51854]: 2026-02-17 17:12:13.409721932 +0000 UTC m=+2.450332663 image pull be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 17 17:12:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:13 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:13 compute-0 sudo[51839]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:13 compute-0 sudo[52108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlnwxoripufbxensieblnxlmaolhlpdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348333.6721392-327-83094216163084/AnsiballZ_podman_image.py'
Feb 17 17:12:13 compute-0 sudo[52108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:14 compute-0 python3.9[52111]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 17 17:12:16 compute-0 podman[52123]: 2026-02-17 17:12:16.564392499 +0000 UTC m=+2.402900637 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 17 17:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:12:16 compute-0 sudo[52108]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:17 compute-0 sshd-session[45921]: Connection closed by 192.168.122.30 port 55280
Feb 17 17:12:17 compute-0 sshd-session[45918]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:12:17 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 17 17:12:17 compute-0 systemd[1]: session-10.scope: Consumed 1min 31.538s CPU time.
Feb 17 17:12:17 compute-0 systemd-logind[806]: Session 10 logged out. Waiting for processes to exit.
Feb 17 17:12:17 compute-0 systemd-logind[806]: Removed session 10.
Feb 17 17:12:21 compute-0 sshd-session[52272]: Connection closed by authenticating user root 209.38.233.161 port 36394 [preauth]
Feb 17 17:12:21 compute-0 sshd-session[52274]: Accepted publickey for zuul from 192.168.122.30 port 57452 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:12:21 compute-0 systemd-logind[806]: New session 11 of user zuul.
Feb 17 17:12:21 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 17 17:12:21 compute-0 sshd-session[52274]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:12:22 compute-0 python3.9[52432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:12:23 compute-0 sudo[52586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphxcehmkpjyqsetdncuyvsnagtuqndz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348343.5054288-32-258618977786791/AnsiballZ_getent.py'
Feb 17 17:12:23 compute-0 sudo[52586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:24 compute-0 python3.9[52589]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 17 17:12:24 compute-0 sudo[52586]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:24 compute-0 sudo[52740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkjiqcgbrbbtccyhojwhmpimmgntwyik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348344.1814845-40-217232585050738/AnsiballZ_group.py'
Feb 17 17:12:24 compute-0 sudo[52740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:24 compute-0 python3.9[52743]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:12:24 compute-0 groupadd[52744]: group added to /etc/group: name=openvswitch, GID=42476
Feb 17 17:12:24 compute-0 groupadd[52744]: group added to /etc/gshadow: name=openvswitch
Feb 17 17:12:24 compute-0 groupadd[52744]: new group: name=openvswitch, GID=42476
Feb 17 17:12:24 compute-0 sudo[52740]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:25 compute-0 sudo[52899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uadmyffspuufasketorclxexlvfwccgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348344.8862493-48-262423953766030/AnsiballZ_user.py'
Feb 17 17:12:25 compute-0 sudo[52899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:25 compute-0 python3.9[52902]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 17 17:12:25 compute-0 useradd[52904]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/1
Feb 17 17:12:25 compute-0 useradd[52904]: add 'openvswitch' to group 'hugetlbfs'
Feb 17 17:12:25 compute-0 useradd[52904]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 17 17:12:25 compute-0 sudo[52899]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:26 compute-0 sudo[53060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtcracvdthnxeiueavhqyknmjgjyhsuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348345.9013908-58-203902778693699/AnsiballZ_setup.py'
Feb 17 17:12:26 compute-0 sudo[53060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:26 compute-0 python3.9[53063]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:12:26 compute-0 sudo[53060]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:26 compute-0 sudo[53145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibtjpheckybpqbmpfcehphsndcekvuqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348345.9013908-58-203902778693699/AnsiballZ_dnf.py'
Feb 17 17:12:26 compute-0 sudo[53145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:27 compute-0 python3.9[53148]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:12:28 compute-0 sudo[53145]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:29 compute-0 sudo[53308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywebgqfvnplorcopkgbachbotgcglyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348348.8900871-72-229756805478312/AnsiballZ_dnf.py'
Feb 17 17:12:29 compute-0 sudo[53308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:29 compute-0 python3.9[53311]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:12:43 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:12:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:12:43 compute-0 groupadd[53334]: group added to /etc/group: name=unbound, GID=994
Feb 17 17:12:43 compute-0 groupadd[53334]: group added to /etc/gshadow: name=unbound
Feb 17 17:12:43 compute-0 groupadd[53334]: new group: name=unbound, GID=994
Feb 17 17:12:44 compute-0 useradd[53341]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 17 17:12:44 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 17 17:12:44 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 17 17:12:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:12:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:12:46 compute-0 systemd[1]: Reloading.
Feb 17 17:12:46 compute-0 systemd-rc-local-generator[53832]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:12:46 compute-0 systemd-sysv-generator[53835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:12:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:12:48 compute-0 sudo[53308]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:49 compute-0 sudo[54431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgrgpyxcdbbfduxzwnqotxzycjciskph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348368.4901192-80-255972534177962/AnsiballZ_systemd.py'
Feb 17 17:12:49 compute-0 sudo[54431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:49 compute-0 python3.9[54434]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:12:49 compute-0 systemd[1]: Reloading.
Feb 17 17:12:49 compute-0 systemd-sysv-generator[54473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:12:49 compute-0 systemd-rc-local-generator[54467]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:12:49 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 17 17:12:49 compute-0 chown[54483]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 17 17:12:49 compute-0 ovs-ctl[54488]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 17 17:12:49 compute-0 ovs-ctl[54488]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 17 17:12:49 compute-0 ovs-ctl[54488]: Starting ovsdb-server [  OK  ]
Feb 17 17:12:49 compute-0 ovs-vsctl[54537]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 17 17:12:50 compute-0 ovs-vsctl[54557]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ee0cee2f-3200-4f1f-8903-57b18789347d\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 17 17:12:50 compute-0 ovs-ctl[54488]: Configuring Open vSwitch system IDs [  OK  ]
Feb 17 17:12:50 compute-0 ovs-vsctl[54563]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 17 17:12:50 compute-0 ovs-ctl[54488]: Enabling remote OVSDB managers [  OK  ]
Feb 17 17:12:50 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 17 17:12:50 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 17 17:12:50 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:12:50 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:12:50 compute-0 systemd[1]: run-r6c050c9834a4424c856902fede68d77c.service: Deactivated successfully.
Feb 17 17:12:50 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 17 17:12:50 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 17 17:12:50 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 17 17:12:50 compute-0 ovs-ctl[54608]: Inserting openvswitch module [  OK  ]
Feb 17 17:12:50 compute-0 ovs-ctl[54577]: Starting ovs-vswitchd [  OK  ]
Feb 17 17:12:50 compute-0 ovs-ctl[54577]: Enabling remote OVSDB managers [  OK  ]
Feb 17 17:12:50 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 17 17:12:50 compute-0 ovs-vsctl[54626]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 17 17:12:50 compute-0 systemd[1]: Starting Open vSwitch...
Feb 17 17:12:50 compute-0 systemd[1]: Finished Open vSwitch.
Feb 17 17:12:50 compute-0 sudo[54431]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:51 compute-0 python3.9[54777]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:12:51 compute-0 sudo[54927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpznuicaperddkvtcmazggmytnzviqdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348371.5900733-99-72799313072607/AnsiballZ_sefcontext.py'
Feb 17 17:12:51 compute-0 sudo[54927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:52 compute-0 python3.9[54930]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 17 17:12:53 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:12:53 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:12:53 compute-0 sudo[54927]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:54 compute-0 python3.9[55085]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:12:55 compute-0 sudo[55241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwxrxkthbobodilqrkvohhqicmxdfawb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348374.9015841-117-87098115709906/AnsiballZ_dnf.py'
Feb 17 17:12:55 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 17 17:12:55 compute-0 sudo[55241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:55 compute-0 python3.9[55244]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:12:56 compute-0 sudo[55241]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:57 compute-0 sudo[55395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miwcczvmeaddamebvbaboawakqsrohsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348376.7193804-125-256159112436169/AnsiballZ_command.py'
Feb 17 17:12:57 compute-0 sudo[55395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:57 compute-0 python3.9[55398]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:12:57 compute-0 sudo[55395]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:58 compute-0 sudo[55683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zemqbnpupsovninttmywnhpptogkvwql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348378.1043687-133-202560559198033/AnsiballZ_file.py'
Feb 17 17:12:58 compute-0 sudo[55683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:12:58 compute-0 python3.9[55686]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 17 17:12:58 compute-0 sudo[55683]: pam_unix(sudo:session): session closed for user root
Feb 17 17:12:59 compute-0 python3.9[55836]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:12:59 compute-0 sudo[55988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqwmazaoyggtyjdtdmtzuothluycydii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348379.6021037-149-181959880326862/AnsiballZ_dnf.py'
Feb 17 17:12:59 compute-0 sudo[55988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:00 compute-0 python3.9[55991]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:13:01 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:13:01 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:13:01 compute-0 systemd[1]: Reloading.
Feb 17 17:13:01 compute-0 systemd-rc-local-generator[56027]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:13:01 compute-0 systemd-sysv-generator[56031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:13:01 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:13:02 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:13:02 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:13:02 compute-0 systemd[1]: run-r4f974660d82c4052812534213b67f3dc.service: Deactivated successfully.
Feb 17 17:13:02 compute-0 sudo[55988]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:02 compute-0 sudo[56314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epvifodnaoxubrtzqxrijjbckejtddpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348382.415168-157-153622461126732/AnsiballZ_systemd.py'
Feb 17 17:13:02 compute-0 sudo[56314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:02 compute-0 python3.9[56317]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:13:03 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 17 17:13:03 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 17 17:13:03 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 17 17:13:03 compute-0 systemd[1]: Stopping Network Manager...
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0203] caught SIGTERM, shutting down normally.
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0217] dhcp4 (eth0): canceled DHCP transaction
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0217] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0217] dhcp4 (eth0): state changed no lease
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0219] manager: NetworkManager state is now CONNECTED_SITE
Feb 17 17:13:03 compute-0 NetworkManager[7685]: <info>  [1771348383.0275] exiting (success)
Feb 17 17:13:03 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 17:13:03 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 17 17:13:03 compute-0 systemd[1]: Stopped Network Manager.
Feb 17 17:13:03 compute-0 systemd[1]: NetworkManager.service: Consumed 11.925s CPU time, 4.1M memory peak, read 0B from disk, written 16.5K to disk.
Feb 17 17:13:03 compute-0 systemd[1]: Starting Network Manager...
Feb 17 17:13:03 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.0937] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:1da25986-a7c7-4f2a-b760-e7b6d26f1215)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.0938] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1009] manager[0x5597cb3fb000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 17 17:13:03 compute-0 systemd[1]: Starting Hostname Service...
Feb 17 17:13:03 compute-0 systemd[1]: Started Hostname Service.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1666] hostname: hostname: using hostnamed
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1669] hostname: static hostname changed from (none) to "compute-0"
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1677] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1681] manager[0x5597cb3fb000]: rfkill: Wi-Fi hardware radio set enabled
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1681] manager[0x5597cb3fb000]: rfkill: WWAN hardware radio set enabled
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1702] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1709] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1709] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1710] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1710] manager: Networking is enabled by state file
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1712] settings: Loaded settings plugin: keyfile (internal)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1715] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1735] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1742] dhcp: init: Using DHCP client 'internal'
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1745] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1749] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1754] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1761] device (lo): Activation: starting connection 'lo' (408cebb5-d164-4b54-9d84-326ea0ceda94)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1766] device (eth0): carrier: link connected
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1769] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1773] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1774] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1778] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1783] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1787] device (eth1): carrier: link connected
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1790] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1793] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe) (indicated)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1794] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1797] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1802] device (eth1): Activation: starting connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe)
Feb 17 17:13:03 compute-0 systemd[1]: Started Network Manager.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1809] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1817] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1819] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1821] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1822] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1825] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1827] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1829] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1831] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1838] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1840] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1848] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1858] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1882] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1887] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1960] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1967] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1968] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1969] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1975] device (lo): Activation: successful, device activated.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1982] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1985] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1987] device (eth1): Activation: successful, device activated.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1997] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.1998] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.2000] manager: NetworkManager state is now CONNECTED_SITE
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.2002] device (eth0): Activation: successful, device activated.
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.2007] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 17 17:13:03 compute-0 NetworkManager[56323]: <info>  [1771348383.2009] manager: startup complete
Feb 17 17:13:03 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 17 17:13:03 compute-0 sudo[56314]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:03 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 17 17:13:03 compute-0 sudo[56541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsjxnaepuprznosvonoopqjqnhtzmpur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348383.3908706-165-267078290040781/AnsiballZ_dnf.py'
Feb 17 17:13:03 compute-0 sudo[56541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:03 compute-0 python3.9[56544]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:13:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:13:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:13:07 compute-0 systemd[1]: Reloading.
Feb 17 17:13:07 compute-0 systemd-sysv-generator[56598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:13:07 compute-0 systemd-rc-local-generator[56591]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:13:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:13:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:13:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:13:08 compute-0 systemd[1]: run-r262a935a8d354f838c43111a9b503a5a.service: Deactivated successfully.
Feb 17 17:13:08 compute-0 sudo[56541]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:09 compute-0 sudo[57021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqswlurbqreeffmdjvlwwvvzuslftbsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348389.182381-177-43810220552557/AnsiballZ_stat.py'
Feb 17 17:13:09 compute-0 sudo[57021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:09 compute-0 python3.9[57024]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:13:09 compute-0 sudo[57021]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:10 compute-0 sudo[57174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qstzzmbmbkmvxrfkkafsnzwjljilonkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348389.7746966-186-25161214409787/AnsiballZ_ini_file.py'
Feb 17 17:13:10 compute-0 sudo[57174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:10 compute-0 python3.9[57177]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:10 compute-0 sudo[57174]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:10 compute-0 sudo[57329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehkfdkblzxiyxoefgejaqwvhexjuvmsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348390.5557494-196-122908871280931/AnsiballZ_ini_file.py'
Feb 17 17:13:10 compute-0 sudo[57329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:11 compute-0 python3.9[57332]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:11 compute-0 sudo[57329]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:11 compute-0 sudo[57482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsbfxngjfrufktexkkydlulffekziwdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348391.1795743-196-25422861075541/AnsiballZ_ini_file.py'
Feb 17 17:13:11 compute-0 sudo[57482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:11 compute-0 python3.9[57485]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:11 compute-0 sudo[57482]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:12 compute-0 sudo[57635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccotnkrjgjdzdenilzbrvtvbjvjeiqru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348391.8040824-211-75603875356489/AnsiballZ_ini_file.py'
Feb 17 17:13:12 compute-0 sudo[57635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:12 compute-0 python3.9[57638]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:12 compute-0 sudo[57635]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:12 compute-0 sudo[57788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veepzvgrwfiynoehwnxftytcocvvjkhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348392.3797476-211-196254587859135/AnsiballZ_ini_file.py'
Feb 17 17:13:12 compute-0 sudo[57788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:12 compute-0 python3.9[57791]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:12 compute-0 sudo[57788]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:13 compute-0 sudo[57941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acqdgbjtavyflkqythgumkynlsnzllcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348393.007672-226-254609086421497/AnsiballZ_stat.py'
Feb 17 17:13:13 compute-0 sudo[57941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:13 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 17:13:13 compute-0 python3.9[57944]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:13:13 compute-0 sudo[57941]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:13 compute-0 sudo[58065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmxtscgrtsehxxidmbzexweeyuyspcdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348393.007672-226-254609086421497/AnsiballZ_copy.py'
Feb 17 17:13:13 compute-0 sudo[58065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:14 compute-0 python3.9[58068]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348393.007672-226-254609086421497/.source _original_basename=.wl9noz14 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:14 compute-0 sudo[58065]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:15 compute-0 sudo[58218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjkzpuiyxxtszvgpofvbnlaottvsistk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348395.1661613-241-27377917032084/AnsiballZ_file.py'
Feb 17 17:13:15 compute-0 sudo[58218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:15 compute-0 python3.9[58221]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:15 compute-0 sudo[58218]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:16 compute-0 sudo[58371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqdiefagpbmefxlgatvmrdryyrcexqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348395.7871232-249-207538771732413/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 17 17:13:16 compute-0 sudo[58371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:16 compute-0 python3.9[58374]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 17 17:13:16 compute-0 sudo[58371]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:16 compute-0 sudo[58524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqwxzbfitgruoqwdhsudscmvmhsogdqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348396.6284375-258-6565803950308/AnsiballZ_file.py'
Feb 17 17:13:16 compute-0 sudo[58524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:17 compute-0 python3.9[58527]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:17 compute-0 sudo[58524]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:17 compute-0 sudo[58677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huontbxaqbwxsintyjwzcjzwinlffrjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348397.38002-268-42862340445334/AnsiballZ_stat.py'
Feb 17 17:13:17 compute-0 sudo[58677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:17 compute-0 sudo[58677]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:18 compute-0 sudo[58801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvjvrjxltmfomfooxhrppxbadtnpguqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348397.38002-268-42862340445334/AnsiballZ_copy.py'
Feb 17 17:13:18 compute-0 sudo[58801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:18 compute-0 sudo[58801]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:19 compute-0 sudo[58954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wizwdwanfkppiexsmaomnyrhxmhqtkbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348398.6263-283-193928289930466/AnsiballZ_slurp.py'
Feb 17 17:13:19 compute-0 sudo[58954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:19 compute-0 python3.9[58957]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 17 17:13:19 compute-0 sudo[58954]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:20 compute-0 sudo[59130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kksfppxedkrxfazwecqdpzqwybolvola ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348399.646088-292-276663349133315/async_wrapper.py j588100051892 300 /home/zuul/.ansible/tmp/ansible-tmp-1771348399.646088-292-276663349133315/AnsiballZ_edpm_os_net_config.py _'
Feb 17 17:13:20 compute-0 sudo[59130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:20 compute-0 ansible-async_wrapper.py[59133]: Invoked with j588100051892 300 /home/zuul/.ansible/tmp/ansible-tmp-1771348399.646088-292-276663349133315/AnsiballZ_edpm_os_net_config.py _
Feb 17 17:13:20 compute-0 ansible-async_wrapper.py[59136]: Starting module and watcher
Feb 17 17:13:20 compute-0 ansible-async_wrapper.py[59136]: Start watching 59137 (300)
Feb 17 17:13:20 compute-0 ansible-async_wrapper.py[59137]: Start module (59137)
Feb 17 17:13:20 compute-0 ansible-async_wrapper.py[59133]: Return async_wrapper task started.
Feb 17 17:13:20 compute-0 sudo[59130]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:20 compute-0 python3.9[59138]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 17 17:13:21 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 17 17:13:21 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 17 17:13:21 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 17 17:13:21 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 17 17:13:21 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.5482] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.5508] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6187] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6191] audit: op="connection-add" uuid="cfe8b2b8-175e-415e-935b-552c1d3a78f5" name="br-ex-br" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6207] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6209] audit: op="connection-add" uuid="aa501322-9101-461e-8466-4db118c21958" name="br-ex-port" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6223] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6225] audit: op="connection-add" uuid="b3bbf817-4ce8-48d8-9423-a6daa458a5fb" name="eth1-port" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6239] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6241] audit: op="connection-add" uuid="7883b581-f9fe-40fe-9833-bda81fe9977d" name="vlan20-port" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6254] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6257] audit: op="connection-add" uuid="fa2661fe-30bd-4697-8ee9-59288bf19501" name="vlan21-port" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6270] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6272] audit: op="connection-add" uuid="42b65cb7-9e94-493f-8d41-49983b2348ae" name="vlan22-port" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6295] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6314] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6316] audit: op="connection-add" uuid="12d4b622-1d07-4c5f-b45d-9569cfeaf267" name="br-ex-if" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6542] audit: op="connection-update" uuid="90d80dee-6d07-52d6-8d2c-113ca6c279fe" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.timestamp,connection.master,connection.controller,ipv4.dns,ipv4.routing-rules,ipv4.never-default,ipv4.routes,ipv4.addresses,ipv4.method,ipv6.dns,ipv6.addr-gen-mode,ipv6.routes,ipv6.addresses,ipv6.method,ipv6.routing-rules,ovs-external-ids.data,ovs-interface.type" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6563] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6565] audit: op="connection-add" uuid="330f19c0-48ba-457d-a587-8fd67183973c" name="vlan20-if" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6594] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6597] audit: op="connection-add" uuid="c1e02c8b-2a9c-45b2-bb5d-d5b623eb6cae" name="vlan21-if" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6627] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6631] audit: op="connection-add" uuid="98981296-f414-4eaf-bc2b-70a78ac99a76" name="vlan22-if" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6651] audit: op="connection-delete" uuid="67d0a0b6-42d4-3838-8d12-c8bde1f2493b" name="Wired connection 1" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6673] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6678] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6691] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6699] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (cfe8b2b8-175e-415e-935b-552c1d3a78f5)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6702] audit: op="connection-activate" uuid="cfe8b2b8-175e-415e-935b-552c1d3a78f5" name="br-ex-br" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6708] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6711] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6723] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6732] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (aa501322-9101-461e-8466-4db118c21958)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6736] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6740] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6750] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6758] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (b3bbf817-4ce8-48d8-9423-a6daa458a5fb)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6764] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6767] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6777] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6787] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (7883b581-f9fe-40fe-9833-bda81fe9977d)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6791] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6795] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6804] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6813] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fa2661fe-30bd-4697-8ee9-59288bf19501)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6818] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6822] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6832] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6841] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (42b65cb7-9e94-493f-8d41-49983b2348ae)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6844] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6849] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6857] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6871] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6872] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6878] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6886] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (12d4b622-1d07-4c5f-b45d-9569cfeaf267)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6887] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6894] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6896] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6899] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6901] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6919] device (eth1): disconnecting for new activation request.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6921] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6925] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6927] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6929] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6933] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6934] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6939] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6945] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (330f19c0-48ba-457d-a587-8fd67183973c)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6947] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6950] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6952] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6954] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6957] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6958] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6962] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6966] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (c1e02c8b-2a9c-45b2-bb5d-d5b623eb6cae)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6967] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6971] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6973] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6974] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6978] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <warn>  [1771348402.6978] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6983] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6988] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (98981296-f414-4eaf-bc2b-70a78ac99a76)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6988] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6992] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6994] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6996] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.6998] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7020] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7023] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7027] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7029] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7041] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7045] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7057] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7063] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7068] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7076] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7093] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7098] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7101] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7107] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7112] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7114] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7116] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 kernel: Timeout policy base is empty
Feb 17 17:13:22 compute-0 systemd-udevd[59145]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7129] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7133] dhcp4 (eth0): canceled DHCP transaction
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7133] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7133] dhcp4 (eth0): state changed no lease
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7134] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7144] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7148] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59139 uid=0 result="fail" reason="Device is not activated"
Feb 17 17:13:22 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 17 17:13:22 compute-0 kernel: br-ex: entered promiscuous mode
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7377] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7381] dhcp4 (eth0): state changed new lease, address=38.102.83.53
Feb 17 17:13:22 compute-0 systemd-udevd[59144]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:13:22 compute-0 kernel: vlan20: entered promiscuous mode
Feb 17 17:13:22 compute-0 kernel: vlan21: entered promiscuous mode
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7529] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7718] device (eth1): Activation: starting connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7724] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7726] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7727] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7728] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7730] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7731] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7740] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7756] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7762] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7764] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7768] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7773] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7776] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7780] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7783] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7786] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7789] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7792] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7795] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7798] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7800] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7803] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7805] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7809] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7810] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7812] device (eth1): released from controller device eth1
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7816] device (eth1): disconnecting for new activation request.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7816] audit: op="connection-activate" uuid="90d80dee-6d07-52d6-8d2c-113ca6c279fe" name="ci-private-network" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7828] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7839] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.7843] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8153] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8194] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59139 uid=0 result="success"
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8195] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8202] device (eth1): Activation: starting connection 'ci-private-network' (90d80dee-6d07-52d6-8d2c-113ca6c279fe)
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8205] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8208] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8211] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8214] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8218] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8223] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8224] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8226] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8231] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8234] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8238] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8242] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8247] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8251] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 kernel: vlan22: entered promiscuous mode
Feb 17 17:13:22 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8322] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8325] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8328] device (eth1): Activation: successful, device activated.
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8369] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8382] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8402] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8403] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 17 17:13:22 compute-0 NetworkManager[56323]: <info>  [1771348402.8407] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 17 17:13:23 compute-0 NetworkManager[56323]: <info>  [1771348403.9617] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.0949] checkpoint[0x5597cb3d1950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.0951] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.3317] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.3331] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 sudo[59476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyretbxmbjriynprehckfkzlpqdtymjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348403.8028297-292-74615070082733/AnsiballZ_async_status.py'
Feb 17 17:13:24 compute-0 sudo[59476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.4912] audit: op="networking-control" arg="global-dns-configuration" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.4964] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.5002] audit: op="networking-control" arg="global-dns-configuration" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.5045] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 python3.9[59479]: ansible-ansible.legacy.async_status Invoked with jid=j588100051892.59133 mode=status _async_dir=/root/.ansible_async
Feb 17 17:13:24 compute-0 sudo[59476]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.6145] checkpoint[0x5597cb3d1a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 17 17:13:24 compute-0 NetworkManager[56323]: <info>  [1771348404.6150] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59139 uid=0 result="success"
Feb 17 17:13:24 compute-0 ansible-async_wrapper.py[59137]: Module complete (59137)
Feb 17 17:13:25 compute-0 ansible-async_wrapper.py[59136]: Done in kid B.
Feb 17 17:13:27 compute-0 sudo[59581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuidurkwlklujztjbtqsvcsieqgmspgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348403.8028297-292-74615070082733/AnsiballZ_async_status.py'
Feb 17 17:13:27 compute-0 sudo[59581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:27 compute-0 python3.9[59584]: ansible-ansible.legacy.async_status Invoked with jid=j588100051892.59133 mode=status _async_dir=/root/.ansible_async
Feb 17 17:13:27 compute-0 sudo[59581]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:28 compute-0 sudo[59682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udiequjgjrlxmxuiqkwiyqbgforaurgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348403.8028297-292-74615070082733/AnsiballZ_async_status.py'
Feb 17 17:13:28 compute-0 sudo[59682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:28 compute-0 python3.9[59685]: ansible-ansible.legacy.async_status Invoked with jid=j588100051892.59133 mode=cleanup _async_dir=/root/.ansible_async
Feb 17 17:13:28 compute-0 sudo[59682]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:28 compute-0 sudo[59835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrxlqlwccjbsjicwxknmebgulmrklwny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348408.5720887-319-51659648277179/AnsiballZ_stat.py'
Feb 17 17:13:28 compute-0 sudo[59835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:29 compute-0 python3.9[59838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:13:29 compute-0 sudo[59835]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:29 compute-0 sudo[59959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqqurpwcbtcarkfqqtvgnorstawzpqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348408.5720887-319-51659648277179/AnsiballZ_copy.py'
Feb 17 17:13:29 compute-0 sudo[59959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:29 compute-0 python3.9[59962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348408.5720887-319-51659648277179/.source.returncode _original_basename=.m7b8les4 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:29 compute-0 sudo[59959]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:30 compute-0 sudo[60112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvgvnifgdpnzaxupifhwycbzvmdnlwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348409.821521-335-82897668223874/AnsiballZ_stat.py'
Feb 17 17:13:30 compute-0 sudo[60112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:30 compute-0 python3.9[60115]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:13:30 compute-0 sudo[60112]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:30 compute-0 sudo[60236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzvbkpebdcfrbdaobzecscluplwdidnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348409.821521-335-82897668223874/AnsiballZ_copy.py'
Feb 17 17:13:30 compute-0 sudo[60236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:30 compute-0 python3.9[60239]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348409.821521-335-82897668223874/.source.cfg _original_basename=.oz7md7e6 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:30 compute-0 sudo[60236]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:31 compute-0 sudo[60390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tadnkegvgcgcgoerjsjqamtkglitzxij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348410.8311086-350-82660855062441/AnsiballZ_systemd.py'
Feb 17 17:13:31 compute-0 sudo[60390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:31 compute-0 python3.9[60393]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:13:31 compute-0 systemd[1]: Reloading Network Manager...
Feb 17 17:13:31 compute-0 NetworkManager[56323]: <info>  [1771348411.4526] audit: op="reload" arg="0" pid=60397 uid=0 result="success"
Feb 17 17:13:31 compute-0 NetworkManager[56323]: <info>  [1771348411.4537] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 17 17:13:31 compute-0 systemd[1]: Reloaded Network Manager.
Feb 17 17:13:31 compute-0 sudo[60390]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:31 compute-0 sshd-session[52282]: Connection closed by 192.168.122.30 port 57452
Feb 17 17:13:31 compute-0 sshd-session[52274]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:13:31 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 17 17:13:31 compute-0 systemd[1]: session-11.scope: Consumed 44.758s CPU time.
Feb 17 17:13:31 compute-0 systemd-logind[806]: Session 11 logged out. Waiting for processes to exit.
Feb 17 17:13:31 compute-0 systemd-logind[806]: Removed session 11.
Feb 17 17:13:33 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 17 17:13:36 compute-0 sshd-session[60429]: Accepted publickey for zuul from 192.168.122.30 port 44732 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:13:36 compute-0 systemd-logind[806]: New session 12 of user zuul.
Feb 17 17:13:36 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 17 17:13:36 compute-0 sshd-session[60429]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:13:37 compute-0 python3.9[60582]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:13:37 compute-0 sshd-session[60583]: Connection closed by authenticating user root 209.38.233.161 port 58676 [preauth]
Feb 17 17:13:38 compute-0 python3.9[60739]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:13:40 compute-0 python3.9[60928]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:13:40 compute-0 sshd-session[60432]: Connection closed by 192.168.122.30 port 44732
Feb 17 17:13:40 compute-0 sshd-session[60429]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:13:40 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 17 17:13:40 compute-0 systemd[1]: session-12.scope: Consumed 2.100s CPU time.
Feb 17 17:13:40 compute-0 systemd-logind[806]: Session 12 logged out. Waiting for processes to exit.
Feb 17 17:13:40 compute-0 systemd-logind[806]: Removed session 12.
Feb 17 17:13:41 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 17 17:13:45 compute-0 sshd-session[60957]: Accepted publickey for zuul from 192.168.122.30 port 48990 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:13:45 compute-0 systemd-logind[806]: New session 13 of user zuul.
Feb 17 17:13:45 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 17 17:13:45 compute-0 sshd-session[60957]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:13:46 compute-0 python3.9[61110]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:13:47 compute-0 python3.9[61264]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:13:47 compute-0 sudo[61419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxwkizqwndpsgugncryapfzpbqpvmrnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348427.730543-35-186939026121700/AnsiballZ_setup.py'
Feb 17 17:13:47 compute-0 sudo[61419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:48 compute-0 python3.9[61422]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:13:48 compute-0 sudo[61419]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:48 compute-0 sudo[61504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpksgmmwopnhomkbigijeqjxkycuaotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348427.730543-35-186939026121700/AnsiballZ_dnf.py'
Feb 17 17:13:48 compute-0 sudo[61504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:49 compute-0 python3.9[61507]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:13:50 compute-0 sudo[61504]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:50 compute-0 sudo[61659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvlweuuckxzizwyonaojtnjyygwmkquv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348430.5632632-47-7428920826096/AnsiballZ_setup.py'
Feb 17 17:13:50 compute-0 sudo[61659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:51 compute-0 python3.9[61662]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:13:51 compute-0 sudo[61659]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:51 compute-0 sudo[61851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwrnyprlrbdylirxnruqzkquwklkpqun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348431.5255783-58-193475749580672/AnsiballZ_file.py'
Feb 17 17:13:51 compute-0 sudo[61851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:52 compute-0 python3.9[61854]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:52 compute-0 sudo[61851]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:52 compute-0 sudo[62004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqinequkhzncjbcownwmpcrtdlaqhoet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348432.2716484-66-261024322770281/AnsiballZ_command.py'
Feb 17 17:13:52 compute-0 sudo[62004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:52 compute-0 python3.9[62007]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:13:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:13:52 compute-0 sudo[62004]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:53 compute-0 sudo[62168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huobwhclrvwzzhzucltbotlboikhzsgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348433.0715353-74-72851027440087/AnsiballZ_stat.py'
Feb 17 17:13:53 compute-0 sudo[62168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:53 compute-0 python3.9[62172]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:13:53 compute-0 sudo[62168]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:53 compute-0 sudo[62248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njivfzxflaelumrhuaezpxtcwwxsqzbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348433.0715353-74-72851027440087/AnsiballZ_file.py'
Feb 17 17:13:53 compute-0 sudo[62248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:54 compute-0 python3.9[62251]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:13:54 compute-0 sudo[62248]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:54 compute-0 sudo[62401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlpatiwjxiqzasbdihfyphiwabgpowtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348434.2546444-86-194033164133590/AnsiballZ_stat.py'
Feb 17 17:13:54 compute-0 sudo[62401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:54 compute-0 python3.9[62404]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:13:54 compute-0 sudo[62401]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:54 compute-0 sudo[62480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msyjbyronjzhiciqvziswtxptlaywipz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348434.2546444-86-194033164133590/AnsiballZ_file.py'
Feb 17 17:13:54 compute-0 sudo[62480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:55 compute-0 python3.9[62483]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:13:55 compute-0 sudo[62480]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:55 compute-0 sudo[62633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzzkbohknzangcbytumtpljzoxaiuxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348435.3298554-99-48305648373871/AnsiballZ_ini_file.py'
Feb 17 17:13:55 compute-0 sudo[62633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:55 compute-0 python3.9[62636]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:13:55 compute-0 sudo[62633]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:56 compute-0 sudo[62786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psbmqdbyzcfodexprbfqztfyewcnpuqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348435.9904857-99-81254063826571/AnsiballZ_ini_file.py'
Feb 17 17:13:56 compute-0 sudo[62786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:56 compute-0 python3.9[62789]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:13:56 compute-0 sudo[62786]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:56 compute-0 sudo[62939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdjdmlxgakhhomqvxovtjudzhnpkokv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348436.4943752-99-251625998797611/AnsiballZ_ini_file.py'
Feb 17 17:13:56 compute-0 sudo[62939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:56 compute-0 python3.9[62942]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:13:56 compute-0 sudo[62939]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:57 compute-0 sudo[63092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kghqhigosjllmdsjrvyqaceuughcunbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348437.0328736-99-33438851688112/AnsiballZ_ini_file.py'
Feb 17 17:13:57 compute-0 sudo[63092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:57 compute-0 python3.9[63095]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:13:57 compute-0 sudo[63092]: pam_unix(sudo:session): session closed for user root
Feb 17 17:13:57 compute-0 sudo[63245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xexsuwiglqgdnkxtsxlphpidzrnmzhzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348437.7562437-130-221373515292491/AnsiballZ_dnf.py'
Feb 17 17:13:57 compute-0 sudo[63245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:13:58 compute-0 python3.9[63248]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:13:59 compute-0 sudo[63245]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:00 compute-0 sudo[63399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdtfnnccermtcdmvpxiuuuikuyhtavr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348439.8795717-141-178090447035875/AnsiballZ_setup.py'
Feb 17 17:14:00 compute-0 sudo[63399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:00 compute-0 python3.9[63402]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:14:00 compute-0 sudo[63399]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:00 compute-0 sudo[63554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgvsbtdrvpznddqbcsdbeccpyncfrafp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348440.5817175-149-275759619198431/AnsiballZ_stat.py'
Feb 17 17:14:00 compute-0 sudo[63554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:01 compute-0 python3.9[63557]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:14:01 compute-0 sudo[63554]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:01 compute-0 sudo[63707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cewecivgrwlddnnirffywujyoczsqglb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348441.2025216-158-258341012316896/AnsiballZ_stat.py'
Feb 17 17:14:01 compute-0 sudo[63707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:01 compute-0 python3.9[63710]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:14:01 compute-0 sudo[63707]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:02 compute-0 sudo[63860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diligbdwjzjvelxwxoodcxsawkzhnlna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348441.86798-168-159900652299007/AnsiballZ_command.py'
Feb 17 17:14:02 compute-0 sudo[63860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:02 compute-0 python3.9[63863]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:14:02 compute-0 sudo[63860]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:02 compute-0 sudo[64014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mulxdevkomvbgssbhtxvihxanhushzle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348442.5293083-178-12582446450191/AnsiballZ_service_facts.py'
Feb 17 17:14:02 compute-0 sudo[64014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:03 compute-0 python3.9[64017]: ansible-service_facts Invoked
Feb 17 17:14:03 compute-0 network[64034]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:14:03 compute-0 network[64035]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:14:03 compute-0 network[64036]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:14:06 compute-0 sudo[64014]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:07 compute-0 sudo[64320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxgarqndmfpcfvhsfhhdjpkthempdykz ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771348446.9091985-193-32764150843563/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771348446.9091985-193-32764150843563/args'
Feb 17 17:14:07 compute-0 sudo[64320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:07 compute-0 sudo[64320]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:07 compute-0 sudo[64488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smsomrdiriyzixswedufpjlcsrywiwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348447.4652898-204-144235318295590/AnsiballZ_dnf.py'
Feb 17 17:14:07 compute-0 sudo[64488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:07 compute-0 python3.9[64491]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:14:09 compute-0 sudo[64488]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:10 compute-0 sudo[64642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrxwispgkimeyypgycucioowfzqaacrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348449.4310446-217-165080848420085/AnsiballZ_package_facts.py'
Feb 17 17:14:10 compute-0 sudo[64642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:10 compute-0 python3.9[64645]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 17 17:14:10 compute-0 sudo[64642]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:11 compute-0 sudo[64795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kphhnugkswpqfbtvmpfanxhcgdempnip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348450.9224944-227-49736017583129/AnsiballZ_stat.py'
Feb 17 17:14:11 compute-0 sudo[64795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:11 compute-0 python3.9[64798]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:11 compute-0 sudo[64795]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:11 compute-0 sudo[64921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tysduyxvxvuvufkirqdbchuzihyionhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348450.9224944-227-49736017583129/AnsiballZ_copy.py'
Feb 17 17:14:11 compute-0 sudo[64921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:12 compute-0 python3.9[64924]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348450.9224944-227-49736017583129/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:12 compute-0 sudo[64921]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:12 compute-0 sudo[65076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnowbnzacmqlycveoxxeeuieertwnuej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348452.2843368-242-249477754535788/AnsiballZ_stat.py'
Feb 17 17:14:12 compute-0 sudo[65076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:12 compute-0 python3.9[65079]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:12 compute-0 sudo[65076]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:13 compute-0 sudo[65202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iozgrpdsiuyoyglwkskkisetdawltmkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348452.2843368-242-249477754535788/AnsiballZ_copy.py'
Feb 17 17:14:13 compute-0 sudo[65202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:13 compute-0 python3.9[65205]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348452.2843368-242-249477754535788/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:13 compute-0 sudo[65202]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:14 compute-0 sudo[65357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqnwmnubhyxqizayyudisflxauxpvxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348453.671357-263-5345334086364/AnsiballZ_lineinfile.py'
Feb 17 17:14:14 compute-0 sudo[65357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:14 compute-0 python3.9[65360]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:14 compute-0 sudo[65357]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:15 compute-0 sudo[65512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqeldptamqbwpdtrsedyzpagrabeufel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348455.0616453-278-211559194375948/AnsiballZ_setup.py'
Feb 17 17:14:15 compute-0 sudo[65512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:15 compute-0 python3.9[65515]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:14:15 compute-0 sudo[65512]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:16 compute-0 sudo[65597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuyibrlgjehojcbsudgzudxokcyoskwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348455.0616453-278-211559194375948/AnsiballZ_systemd.py'
Feb 17 17:14:16 compute-0 sudo[65597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:16 compute-0 python3.9[65600]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:16 compute-0 sudo[65597]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:17 compute-0 sudo[65752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxtqbebcmjapecinuayvvvxplytlelal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348457.1555088-294-15072309140556/AnsiballZ_setup.py'
Feb 17 17:14:17 compute-0 sudo[65752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:17 compute-0 python3.9[65755]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:14:17 compute-0 sudo[65752]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:18 compute-0 sudo[65837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khqzvxymlqhskwuudcjcetsdzllsravq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348457.1555088-294-15072309140556/AnsiballZ_systemd.py'
Feb 17 17:14:18 compute-0 sudo[65837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:18 compute-0 python3.9[65840]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:14:18 compute-0 chronyd[814]: chronyd exiting
Feb 17 17:14:18 compute-0 systemd[1]: Stopping NTP client/server...
Feb 17 17:14:18 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 17 17:14:18 compute-0 systemd[1]: Stopped NTP client/server.
Feb 17 17:14:18 compute-0 systemd[1]: Starting NTP client/server...
Feb 17 17:14:18 compute-0 chronyd[65849]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 17 17:14:18 compute-0 chronyd[65849]: Frequency -23.503 +/- 0.125 ppm read from /var/lib/chrony/drift
Feb 17 17:14:18 compute-0 chronyd[65849]: Loaded seccomp filter (level 2)
Feb 17 17:14:18 compute-0 systemd[1]: Started NTP client/server.
Feb 17 17:14:18 compute-0 sudo[65837]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:19 compute-0 sshd-session[60960]: Connection closed by 192.168.122.30 port 48990
Feb 17 17:14:19 compute-0 sshd-session[60957]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:14:19 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 17 17:14:19 compute-0 systemd[1]: session-13.scope: Consumed 22.932s CPU time.
Feb 17 17:14:19 compute-0 systemd-logind[806]: Session 13 logged out. Waiting for processes to exit.
Feb 17 17:14:19 compute-0 systemd-logind[806]: Removed session 13.
Feb 17 17:14:24 compute-0 sshd-session[65875]: Accepted publickey for zuul from 192.168.122.30 port 42448 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:14:24 compute-0 systemd-logind[806]: New session 14 of user zuul.
Feb 17 17:14:24 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 17 17:14:24 compute-0 sshd-session[65875]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:14:25 compute-0 python3.9[66028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:14:25 compute-0 sudo[66182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwresgzzfgyuzihymvwmfqejuxvikasl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348465.4401445-28-79990635499680/AnsiballZ_file.py'
Feb 17 17:14:25 compute-0 sudo[66182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:26 compute-0 python3.9[66185]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:26 compute-0 sudo[66182]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:26 compute-0 sudo[66358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cniwbkzsxlnylkjkwapftfomgodzeslu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348466.179566-36-62558808934234/AnsiballZ_stat.py'
Feb 17 17:14:26 compute-0 sudo[66358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:26 compute-0 python3.9[66361]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:26 compute-0 sudo[66358]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:27 compute-0 sudo[66437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kicypexskvjcpeerlgdswjxqrejhxdne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348466.179566-36-62558808934234/AnsiballZ_file.py'
Feb 17 17:14:27 compute-0 sudo[66437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:27 compute-0 python3.9[66440]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.6po9wgoq recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:27 compute-0 sudo[66437]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:27 compute-0 sudo[66590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swkmemcmxoupoxlfmmedtgeynhdrcfbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348467.464686-56-233404786712461/AnsiballZ_stat.py'
Feb 17 17:14:27 compute-0 sudo[66590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:27 compute-0 python3.9[66593]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:27 compute-0 sudo[66590]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:28 compute-0 sudo[66714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cclcawbgknrqlbhbbjqnsgoihvqnscgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348467.464686-56-233404786712461/AnsiballZ_copy.py'
Feb 17 17:14:28 compute-0 sudo[66714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:28 compute-0 python3.9[66717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348467.464686-56-233404786712461/.source _original_basename=.w6irr22c follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:28 compute-0 sudo[66714]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:28 compute-0 sudo[66867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcbsxinawuynfmajnzdodcsmaspbjaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348468.7345645-72-93274454197585/AnsiballZ_file.py'
Feb 17 17:14:28 compute-0 sudo[66867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:29 compute-0 python3.9[66870]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:14:29 compute-0 sudo[66867]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:29 compute-0 sudo[67020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcjmnjbgxhbmgfgcxmemhrgaipvkouvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348469.3009746-80-278801415284906/AnsiballZ_stat.py'
Feb 17 17:14:29 compute-0 sudo[67020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:29 compute-0 python3.9[67023]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:29 compute-0 sudo[67020]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:30 compute-0 sudo[67144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwaxhbjolncvxfrhekqqhycsjsajpanc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348469.3009746-80-278801415284906/AnsiballZ_copy.py'
Feb 17 17:14:30 compute-0 sudo[67144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:30 compute-0 python3.9[67147]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348469.3009746-80-278801415284906/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:14:30 compute-0 sudo[67144]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:30 compute-0 sudo[67297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wabprkiuuyehyqtlqxfihaixskjmpkos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348470.3286839-80-181772084585233/AnsiballZ_stat.py'
Feb 17 17:14:30 compute-0 sudo[67297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:30 compute-0 python3.9[67300]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:30 compute-0 sudo[67297]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:31 compute-0 sudo[67421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdxxwyipngkpnslcpwlioooqudblpxje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348470.3286839-80-181772084585233/AnsiballZ_copy.py'
Feb 17 17:14:31 compute-0 sudo[67421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:31 compute-0 python3.9[67424]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348470.3286839-80-181772084585233/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:14:31 compute-0 sudo[67421]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:31 compute-0 sudo[67574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doykdwyihunhwetakhlphmcrtkxdknbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348471.40877-109-253604637407000/AnsiballZ_file.py'
Feb 17 17:14:31 compute-0 sudo[67574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:31 compute-0 python3.9[67577]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:31 compute-0 sudo[67574]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:32 compute-0 sudo[67727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzsrrjrtnjjrqrbhobfmbsobnidsftij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348471.93231-117-38734355625906/AnsiballZ_stat.py'
Feb 17 17:14:32 compute-0 sudo[67727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:32 compute-0 python3.9[67730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:32 compute-0 sudo[67727]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:32 compute-0 sudo[67851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrwjfvzbqaigwkpxyxkjztsrghwvwot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348471.93231-117-38734355625906/AnsiballZ_copy.py'
Feb 17 17:14:32 compute-0 sudo[67851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:32 compute-0 python3.9[67854]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348471.93231-117-38734355625906/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:32 compute-0 sudo[67851]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:33 compute-0 sudo[68004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykxgeqgdahrzjhvdggckvigmhuncbrox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348473.0403047-132-59801721756016/AnsiballZ_stat.py'
Feb 17 17:14:33 compute-0 sudo[68004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:33 compute-0 python3.9[68007]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:33 compute-0 sudo[68004]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:33 compute-0 sudo[68128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdffsqgoqlgryhdaqypgxixqznmklqaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348473.0403047-132-59801721756016/AnsiballZ_copy.py'
Feb 17 17:14:33 compute-0 sudo[68128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:33 compute-0 python3.9[68131]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348473.0403047-132-59801721756016/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:33 compute-0 sudo[68128]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:34 compute-0 sudo[68281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmvfifaxaikgjlericxjdjlzdvmuxnau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348474.1050217-147-67841839161335/AnsiballZ_systemd.py'
Feb 17 17:14:34 compute-0 sudo[68281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:34 compute-0 python3.9[68284]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:34 compute-0 systemd[1]: Reloading.
Feb 17 17:14:35 compute-0 systemd-rc-local-generator[68307]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:35 compute-0 systemd-sysv-generator[68310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:35 compute-0 systemd[1]: Reloading.
Feb 17 17:14:35 compute-0 systemd-rc-local-generator[68359]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:35 compute-0 systemd-sysv-generator[68362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:35 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 17 17:14:35 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 17 17:14:35 compute-0 sudo[68281]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:35 compute-0 sudo[68523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryvmepsmcawppnifiwbygdqjxafpsoer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348475.519563-155-83923902643878/AnsiballZ_stat.py'
Feb 17 17:14:35 compute-0 sudo[68523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:35 compute-0 python3.9[68526]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:35 compute-0 sudo[68523]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:36 compute-0 sudo[68647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sukpwgjlefbvikylowygrbqubczptqua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348475.519563-155-83923902643878/AnsiballZ_copy.py'
Feb 17 17:14:36 compute-0 sudo[68647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:36 compute-0 python3.9[68650]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348475.519563-155-83923902643878/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:36 compute-0 sudo[68647]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:36 compute-0 sudo[68800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmfqhcgkryuixqxnipurtwqgwrieyuvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348476.5392041-170-174802958735989/AnsiballZ_stat.py'
Feb 17 17:14:36 compute-0 sudo[68800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:36 compute-0 python3.9[68803]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:36 compute-0 sudo[68800]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:37 compute-0 sudo[68924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvficsyxyqfhvlvbxgfrakuflmuxzybc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348476.5392041-170-174802958735989/AnsiballZ_copy.py'
Feb 17 17:14:37 compute-0 sudo[68924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:37 compute-0 python3.9[68927]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348476.5392041-170-174802958735989/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:37 compute-0 sudo[68924]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:37 compute-0 sudo[69077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrhvxxqtvtvfvbadunvzriblzanbvwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348477.6303926-185-207828275775137/AnsiballZ_systemd.py'
Feb 17 17:14:37 compute-0 sudo[69077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:38 compute-0 python3.9[69080]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:38 compute-0 systemd[1]: Reloading.
Feb 17 17:14:38 compute-0 systemd-rc-local-generator[69103]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:38 compute-0 systemd-sysv-generator[69106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:38 compute-0 systemd[1]: Reloading.
Feb 17 17:14:38 compute-0 systemd-sysv-generator[69157]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:38 compute-0 systemd-rc-local-generator[69152]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:38 compute-0 systemd[1]: Starting Create netns directory...
Feb 17 17:14:38 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 17 17:14:38 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 17 17:14:38 compute-0 systemd[1]: Finished Create netns directory.
Feb 17 17:14:38 compute-0 sudo[69077]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:39 compute-0 python3.9[69319]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:14:39 compute-0 network[69336]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:14:39 compute-0 network[69337]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:14:39 compute-0 network[69338]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:14:42 compute-0 sudo[69599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqmjpvxipylvniafmbhwbyuibejmvwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348482.5122364-201-146039388788809/AnsiballZ_systemd.py'
Feb 17 17:14:42 compute-0 sudo[69599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:43 compute-0 python3.9[69602]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:43 compute-0 systemd[1]: Reloading.
Feb 17 17:14:43 compute-0 systemd-sysv-generator[69626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:43 compute-0 systemd-rc-local-generator[69621]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:43 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 17 17:14:43 compute-0 iptables.init[69649]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 17 17:14:43 compute-0 iptables.init[69649]: iptables: Flushing firewall rules: [  OK  ]
Feb 17 17:14:43 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 17 17:14:43 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 17 17:14:43 compute-0 sudo[69599]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:43 compute-0 sudo[69843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjcdwpmcxilsdzmscmmmshoinqlpuehb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348483.6883109-201-68651766698879/AnsiballZ_systemd.py'
Feb 17 17:14:43 compute-0 sudo[69843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:44 compute-0 python3.9[69846]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:44 compute-0 sudo[69843]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:44 compute-0 sudo[69998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojdjijebhkvpfjwamyqoqxvwanrqyfop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348484.4325862-217-66046879955137/AnsiballZ_systemd.py'
Feb 17 17:14:44 compute-0 sudo[69998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:44 compute-0 python3.9[70001]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:14:45 compute-0 systemd[1]: Reloading.
Feb 17 17:14:45 compute-0 systemd-sysv-generator[70035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:14:45 compute-0 systemd-rc-local-generator[70030]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:14:45 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 17 17:14:45 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 17 17:14:45 compute-0 sudo[69998]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:45 compute-0 sudo[70198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywkpnfslqpomqfmyoguhhkzcrpuduip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348485.424431-225-256235823541088/AnsiballZ_command.py'
Feb 17 17:14:45 compute-0 sudo[70198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:45 compute-0 python3.9[70201]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:14:46 compute-0 sudo[70198]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:46 compute-0 sudo[70352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlzxlogivnxzdnpvwbhdgxtvtbjpftgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348486.3198178-239-242505028572572/AnsiballZ_stat.py'
Feb 17 17:14:46 compute-0 sudo[70352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:46 compute-0 python3.9[70355]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:46 compute-0 sudo[70352]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:47 compute-0 sudo[70478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthvbqepblxugfjstawewaznjzjdopud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348486.3198178-239-242505028572572/AnsiballZ_copy.py'
Feb 17 17:14:47 compute-0 sudo[70478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:47 compute-0 python3.9[70481]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348486.3198178-239-242505028572572/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:47 compute-0 sudo[70478]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:47 compute-0 sudo[70632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pevoqonuobmmfiwppfoybrokqytxsxfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348487.4225607-254-70654517018972/AnsiballZ_systemd.py'
Feb 17 17:14:47 compute-0 sudo[70632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:47 compute-0 python3.9[70635]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:14:47 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 17 17:14:47 compute-0 sshd[1016]: Received SIGHUP; restarting.
Feb 17 17:14:47 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 17 17:14:48 compute-0 sshd[1016]: Server listening on 0.0.0.0 port 22.
Feb 17 17:14:48 compute-0 sshd[1016]: Server listening on :: port 22.
Feb 17 17:14:48 compute-0 sudo[70632]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:48 compute-0 sudo[70789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drrltkcigoholsiravsbvascfnpptvpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348488.1791084-262-19626499059018/AnsiballZ_file.py'
Feb 17 17:14:48 compute-0 sudo[70789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:48 compute-0 python3.9[70792]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:48 compute-0 sudo[70789]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:48 compute-0 sudo[70942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efcrcgzenamxxxospvdtwzsympougnwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348488.730154-270-225060742835086/AnsiballZ_stat.py'
Feb 17 17:14:48 compute-0 sudo[70942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:49 compute-0 python3.9[70945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:49 compute-0 sudo[70942]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:49 compute-0 sudo[71066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pallnaqerxjxeclfayebystvcffzvthg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348488.730154-270-225060742835086/AnsiballZ_copy.py'
Feb 17 17:14:49 compute-0 sudo[71066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:49 compute-0 python3.9[71069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348488.730154-270-225060742835086/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:49 compute-0 sudo[71066]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:50 compute-0 sudo[71219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwufvbqfthhkjlysnzcxqjyltbvktxaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348489.9873421-288-186917164884906/AnsiballZ_timezone.py'
Feb 17 17:14:50 compute-0 sudo[71219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:50 compute-0 python3.9[71222]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 17 17:14:50 compute-0 systemd[1]: Starting Time & Date Service...
Feb 17 17:14:50 compute-0 systemd[1]: Started Time & Date Service.
Feb 17 17:14:50 compute-0 sudo[71219]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:51 compute-0 sudo[71378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzzoqpfsmyvpyoxpqthxzqomydqewkcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348490.8689864-297-54451802670100/AnsiballZ_file.py'
Feb 17 17:14:51 compute-0 sudo[71378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:51 compute-0 sshd-session[71251]: Connection closed by authenticating user root 209.38.233.161 port 48120 [preauth]
Feb 17 17:14:51 compute-0 python3.9[71381]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:51 compute-0 sudo[71378]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:51 compute-0 sudo[71531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naynhuhmeqggcmpqjfnvextbshaxtppc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348491.4410448-305-243126574681886/AnsiballZ_stat.py'
Feb 17 17:14:51 compute-0 sudo[71531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:51 compute-0 python3.9[71534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:51 compute-0 sudo[71531]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:52 compute-0 sudo[71655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhiqcrdoqwohhmoyoxhxlrwywoavkcov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348491.4410448-305-243126574681886/AnsiballZ_copy.py'
Feb 17 17:14:52 compute-0 sudo[71655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:52 compute-0 python3.9[71658]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348491.4410448-305-243126574681886/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:52 compute-0 sudo[71655]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:52 compute-0 sudo[71808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkedifdcdfppnwmsmxxnimqrwqeheitr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348492.5015771-320-187483825072472/AnsiballZ_stat.py'
Feb 17 17:14:52 compute-0 sudo[71808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:52 compute-0 python3.9[71811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:52 compute-0 sudo[71808]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:53 compute-0 sudo[71932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvcrxhnhbezcfgjkdplhrhhhijtsitdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348492.5015771-320-187483825072472/AnsiballZ_copy.py'
Feb 17 17:14:53 compute-0 sudo[71932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:53 compute-0 python3.9[71935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348492.5015771-320-187483825072472/.source.yaml _original_basename=.kon7uwwl follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:53 compute-0 sudo[71932]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:53 compute-0 sudo[72085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpdfudbdikdshcftwsolhqwbrikiiiyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348493.5644805-335-122525642613911/AnsiballZ_stat.py'
Feb 17 17:14:53 compute-0 sudo[72085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:53 compute-0 python3.9[72088]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:53 compute-0 sudo[72085]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:54 compute-0 sudo[72209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jplcpkzwdaknqhytxkncqunoxjydiwbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348493.5644805-335-122525642613911/AnsiballZ_copy.py'
Feb 17 17:14:54 compute-0 sudo[72209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:54 compute-0 python3.9[72212]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348493.5644805-335-122525642613911/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:54 compute-0 sudo[72209]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:54 compute-0 sudo[72362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyprmttnwqoopuhrxxjeahzdwhwqkpuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348494.7175713-350-43548178490330/AnsiballZ_command.py'
Feb 17 17:14:54 compute-0 sudo[72362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:55 compute-0 python3.9[72365]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:14:55 compute-0 sudo[72362]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:55 compute-0 sudo[72516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvykmxshztvmhndljtjqzrupgpxsxril ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348495.2790265-358-165293643679689/AnsiballZ_command.py'
Feb 17 17:14:55 compute-0 sudo[72516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:55 compute-0 python3.9[72519]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:14:55 compute-0 sudo[72516]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:56 compute-0 sudo[72670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhobusterqltbinughjauhlrwsimdbky ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348495.8810651-366-175321461088165/AnsiballZ_edpm_nftables_from_files.py'
Feb 17 17:14:56 compute-0 sudo[72670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:56 compute-0 python3[72673]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 17 17:14:56 compute-0 sudo[72670]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:56 compute-0 sudo[72823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbqkzdmzuymrenjakuuxanoxranjvuch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348496.6412683-374-25211593138763/AnsiballZ_stat.py'
Feb 17 17:14:56 compute-0 sudo[72823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:57 compute-0 python3.9[72826]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:57 compute-0 sudo[72823]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:57 compute-0 sudo[72947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srncuaklbtjjrvqiwzjpvincqijgnrpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348496.6412683-374-25211593138763/AnsiballZ_copy.py'
Feb 17 17:14:57 compute-0 sudo[72947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:57 compute-0 python3.9[72950]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348496.6412683-374-25211593138763/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:57 compute-0 sudo[72947]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:58 compute-0 sudo[73100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajettbbatklvavqigcrmedqzuzlbronb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348497.7553754-389-109769490018240/AnsiballZ_stat.py'
Feb 17 17:14:58 compute-0 sudo[73100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:58 compute-0 python3.9[73103]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:58 compute-0 sudo[73100]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:58 compute-0 sudo[73224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xymzungzmxvydrqrukvndzjcvfcypyou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348497.7553754-389-109769490018240/AnsiballZ_copy.py'
Feb 17 17:14:58 compute-0 sudo[73224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:58 compute-0 python3.9[73227]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348497.7553754-389-109769490018240/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:58 compute-0 sudo[73224]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:59 compute-0 sudo[73377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxkvcbuixuknjemazqpxhyffzywnyvri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348498.8843656-404-15279087880819/AnsiballZ_stat.py'
Feb 17 17:14:59 compute-0 sudo[73377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:59 compute-0 python3.9[73380]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:14:59 compute-0 sudo[73377]: pam_unix(sudo:session): session closed for user root
Feb 17 17:14:59 compute-0 sudo[73501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwndnkokvlhhqjoocxkngqycatdmvetp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348498.8843656-404-15279087880819/AnsiballZ_copy.py'
Feb 17 17:14:59 compute-0 sudo[73501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:14:59 compute-0 python3.9[73504]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348498.8843656-404-15279087880819/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:14:59 compute-0 sudo[73501]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:00 compute-0 sudo[73654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kigarbcmqpvxsoswwnozububwgatvegv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348499.910989-419-26691276217831/AnsiballZ_stat.py'
Feb 17 17:15:00 compute-0 sudo[73654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:00 compute-0 python3.9[73657]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:00 compute-0 sudo[73654]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:00 compute-0 sudo[73778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwesrmvwydfmraqwpwwwtpzslweglzqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348499.910989-419-26691276217831/AnsiballZ_copy.py'
Feb 17 17:15:00 compute-0 sudo[73778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:00 compute-0 python3.9[73781]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348499.910989-419-26691276217831/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:00 compute-0 sudo[73778]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:01 compute-0 sudo[73931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spbhybandcbundurpxnialcozurfetvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348500.9820118-434-167388973162370/AnsiballZ_stat.py'
Feb 17 17:15:01 compute-0 sudo[73931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:01 compute-0 python3.9[73934]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:01 compute-0 sudo[73931]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:01 compute-0 sudo[74055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seywxsmjhjmiknoyxwaelllglvncltrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348500.9820118-434-167388973162370/AnsiballZ_copy.py'
Feb 17 17:15:01 compute-0 sudo[74055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:02 compute-0 python3.9[74058]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348500.9820118-434-167388973162370/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:02 compute-0 sudo[74055]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:02 compute-0 sudo[74208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zozzwyjdldrkgkvcqbofoxdainylofck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348502.34311-449-43755119807559/AnsiballZ_file.py'
Feb 17 17:15:02 compute-0 sudo[74208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:02 compute-0 python3.9[74211]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:02 compute-0 sudo[74208]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:03 compute-0 sudo[74361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixghghfhrsyvdepbzverbopxznzxwdwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348502.8982441-457-67807645665715/AnsiballZ_command.py'
Feb 17 17:15:03 compute-0 sudo[74361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:03 compute-0 python3.9[74364]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:15:03 compute-0 sudo[74361]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:03 compute-0 sudo[74521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwfwkyouolhsjwptajtplzdtlmxacfan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348503.5991096-465-266821121594438/AnsiballZ_blockinfile.py'
Feb 17 17:15:03 compute-0 sudo[74521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:04 compute-0 python3.9[74524]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:04 compute-0 sudo[74521]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:04 compute-0 sudo[74675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoibfpvcrjlacdclnxqfhzpglsktqmzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348504.3788826-474-214168295287920/AnsiballZ_file.py'
Feb 17 17:15:04 compute-0 sudo[74675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:04 compute-0 python3.9[74678]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:04 compute-0 sudo[74675]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:05 compute-0 sudo[74828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slirkflannpkdgqyvkcyylspputrhave ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348504.9794161-474-52649332465010/AnsiballZ_file.py'
Feb 17 17:15:05 compute-0 sudo[74828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:05 compute-0 python3.9[74831]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:05 compute-0 sudo[74828]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:06 compute-0 sudo[74981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdfgqvyxymxcpqqwxqhwliadxphzngwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348505.5969067-489-230992020228179/AnsiballZ_mount.py'
Feb 17 17:15:06 compute-0 sudo[74981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:06 compute-0 python3.9[74984]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 17 17:15:06 compute-0 sudo[74981]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:06 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:15:06 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:15:06 compute-0 sudo[75136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjueobnkipnmzwecjechszlslvqnglij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348506.420605-489-216192931930343/AnsiballZ_mount.py'
Feb 17 17:15:06 compute-0 sudo[75136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:06 compute-0 python3.9[75139]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 17 17:15:06 compute-0 sudo[75136]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:07 compute-0 sshd-session[65878]: Connection closed by 192.168.122.30 port 42448
Feb 17 17:15:07 compute-0 sshd-session[65875]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:15:07 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 17 17:15:07 compute-0 systemd[1]: session-14.scope: Consumed 30.631s CPU time.
Feb 17 17:15:07 compute-0 systemd-logind[806]: Session 14 logged out. Waiting for processes to exit.
Feb 17 17:15:07 compute-0 systemd-logind[806]: Removed session 14.
Feb 17 17:15:12 compute-0 sshd-session[75165]: Accepted publickey for zuul from 192.168.122.30 port 39686 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:15:12 compute-0 systemd-logind[806]: New session 15 of user zuul.
Feb 17 17:15:12 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 17 17:15:12 compute-0 sshd-session[75165]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:15:14 compute-0 sudo[75318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dicnqzjetuiwilcruffpaziajlffelwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348512.683751-16-262912016469647/AnsiballZ_tempfile.py'
Feb 17 17:15:14 compute-0 sudo[75318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:14 compute-0 python3.9[75321]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 17 17:15:14 compute-0 sudo[75318]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:15 compute-0 sudo[75471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dervwomrijobqplnorjijvvozvpjiemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348514.9669826-28-123390778226244/AnsiballZ_stat.py'
Feb 17 17:15:15 compute-0 sudo[75471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:15 compute-0 python3.9[75474]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:15:15 compute-0 sudo[75471]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:16 compute-0 sudo[75624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlvdxeyvdivxdvvafifrvgfyyqjdxqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348515.7489128-38-215365786678657/AnsiballZ_setup.py'
Feb 17 17:15:16 compute-0 sudo[75624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:16 compute-0 python3.9[75627]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:15:16 compute-0 sudo[75624]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:17 compute-0 sudo[75777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anzeygppfgiwfifxdfoqapiuzksnntbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348516.7690704-47-8348180798774/AnsiballZ_blockinfile.py'
Feb 17 17:15:17 compute-0 sudo[75777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:17 compute-0 python3.9[75780]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuEu9UsOXSFGjObDuyZ0UTt1ZpvABavE2f11dNVNtWDa8JQfAHR6AXQ86PJA8poxSrj3J+R3ES2I/5H6hqWVK74ycdyinTclQK0vsRrej6u7m04OxPTZqioH/uU7fj5IToYApa5lr56UgZD4z8hLLPJFOEV6DPEceEgxIzUf1VGdxhQC3dm5CrRDOztjNtM1kiMGFEyCQvdcPyTPa5GoB7NHT3GoYLC6aJtWSw8RTJ+Dp8ZG/x4p6utbBYSQnMeEWt21C/Pv5iHIxsNkNfsphXQRLY7tuzdaRExoBhthnc7YXF00SfSPLltEeoVBHr2kFJjMwWfGO9gzFmaE/bngtrzhRUJbWZO5aijlDPgLFXEjaAHiQGHoJdKsYF6qbhk+w2hH+wC3PQ3gSxJyg/tyqTz5klRSrW181QXF+bYfTwd3I7MnUGznL2znTSB1JD6A9Evdm8CIcSiup/4PNpMWG2L8j2m8TpUhfrQdiVML0J6gSOmGxuNtGT/IkmNu7WQHs=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJGTxkZ+z0JcdLCVJwMTS8l084QI2GN9IpNoBc3xaZWT
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLWFNATgYty+vgjnOx/M4wPp406S3XUR02uk+XAXStLwDpZZrww/E5YTG2gjFTWDtRAG5obj1Q3bxaAxCcmdBVY=
                                             create=True mode=0644 path=/tmp/ansible.83bwuv_6 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:17 compute-0 sudo[75777]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:17 compute-0 sudo[75930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovmextyaymjodjpfbkbghhxdawbjonc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348517.4934194-55-8577282133447/AnsiballZ_command.py'
Feb 17 17:15:17 compute-0 sudo[75930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:18 compute-0 python3.9[75933]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.83bwuv_6' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:15:18 compute-0 sudo[75930]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:18 compute-0 sudo[76085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsmmzzckgkfhxnjvqakkyrjjiqlqfqzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348518.2400737-63-207402840623297/AnsiballZ_file.py'
Feb 17 17:15:18 compute-0 sudo[76085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:18 compute-0 python3.9[76088]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.83bwuv_6 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:18 compute-0 sudo[76085]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:19 compute-0 sshd-session[75168]: Connection closed by 192.168.122.30 port 39686
Feb 17 17:15:19 compute-0 sshd-session[75165]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:15:19 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 17 17:15:19 compute-0 systemd[1]: session-15.scope: Consumed 3.126s CPU time.
Feb 17 17:15:19 compute-0 systemd-logind[806]: Session 15 logged out. Waiting for processes to exit.
Feb 17 17:15:19 compute-0 systemd-logind[806]: Removed session 15.
Feb 17 17:15:20 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 17 17:15:23 compute-0 sshd-session[76115]: Accepted publickey for zuul from 192.168.122.30 port 42542 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:15:23 compute-0 systemd-logind[806]: New session 16 of user zuul.
Feb 17 17:15:23 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 17 17:15:23 compute-0 sshd-session[76115]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:15:24 compute-0 python3.9[76268]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:15:25 compute-0 sudo[76422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojbgglttzjlakvkwmybouofvmenvlzuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348524.9500012-27-247854262656390/AnsiballZ_systemd.py'
Feb 17 17:15:25 compute-0 sudo[76422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:25 compute-0 python3.9[76425]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 17 17:15:25 compute-0 sudo[76422]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:26 compute-0 sudo[76577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbclrgavbiuvuwieonixcwzhvqrvtvkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348525.9848335-35-5268139697595/AnsiballZ_systemd.py'
Feb 17 17:15:26 compute-0 sudo[76577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:26 compute-0 python3.9[76580]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:15:26 compute-0 sudo[76577]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:27 compute-0 sudo[76731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkitxiatgxvwmhxfrjvjvrreveahllij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348526.7182586-44-146125541418353/AnsiballZ_command.py'
Feb 17 17:15:27 compute-0 sudo[76731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:27 compute-0 python3.9[76734]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:15:27 compute-0 sudo[76731]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:27 compute-0 sudo[76885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjqikjcxgwykdxqllcqrbaazowwxfhko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348527.4833705-52-119899883787554/AnsiballZ_stat.py'
Feb 17 17:15:27 compute-0 sudo[76885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:28 compute-0 python3.9[76888]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:15:28 compute-0 sudo[76885]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:28 compute-0 sudo[77040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsbytsjccqahjfpsvmswyfklnrnjyxow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348528.2829523-60-3777984422566/AnsiballZ_command.py'
Feb 17 17:15:28 compute-0 sudo[77040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:28 compute-0 python3.9[77043]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:15:28 compute-0 sudo[77040]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:29 compute-0 sudo[77196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuptrqtprhrfinnnivupiydgtdbdjfkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348528.9047725-68-69027734575286/AnsiballZ_file.py'
Feb 17 17:15:29 compute-0 sudo[77196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:29 compute-0 python3.9[77199]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:29 compute-0 sudo[77196]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:29 compute-0 sshd-session[76118]: Connection closed by 192.168.122.30 port 42542
Feb 17 17:15:29 compute-0 sshd-session[76115]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:15:29 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 17 17:15:29 compute-0 systemd[1]: session-16.scope: Consumed 4.017s CPU time.
Feb 17 17:15:29 compute-0 systemd-logind[806]: Session 16 logged out. Waiting for processes to exit.
Feb 17 17:15:29 compute-0 systemd-logind[806]: Removed session 16.
Feb 17 17:15:34 compute-0 sshd-session[77224]: Accepted publickey for zuul from 192.168.122.30 port 39202 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:15:34 compute-0 systemd-logind[806]: New session 17 of user zuul.
Feb 17 17:15:34 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 17 17:15:34 compute-0 sshd-session[77224]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:15:35 compute-0 python3.9[77377]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:15:36 compute-0 sudo[77531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akvlcnbjnvmxihbifnbbxftlwilimvke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348536.1173656-29-253210679333290/AnsiballZ_setup.py'
Feb 17 17:15:36 compute-0 sudo[77531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:36 compute-0 python3.9[77534]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:15:36 compute-0 sudo[77531]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:37 compute-0 sudo[77616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfwvlpznycwmbaforuavcrflpokqpcuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348536.1173656-29-253210679333290/AnsiballZ_dnf.py'
Feb 17 17:15:37 compute-0 sudo[77616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:37 compute-0 python3.9[77619]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 17 17:15:38 compute-0 sudo[77616]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:39 compute-0 python3.9[77770]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:15:40 compute-0 python3.9[77921]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:15:41 compute-0 python3.9[78071]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:15:41 compute-0 python3.9[78221]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:15:41 compute-0 sshd-session[77227]: Connection closed by 192.168.122.30 port 39202
Feb 17 17:15:41 compute-0 sshd-session[77224]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:15:41 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 17 17:15:41 compute-0 systemd[1]: session-17.scope: Consumed 5.264s CPU time.
Feb 17 17:15:41 compute-0 systemd-logind[806]: Session 17 logged out. Waiting for processes to exit.
Feb 17 17:15:41 compute-0 systemd-logind[806]: Removed session 17.
Feb 17 17:15:46 compute-0 sshd-session[78246]: Accepted publickey for zuul from 192.168.122.30 port 37578 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:15:46 compute-0 systemd-logind[806]: New session 18 of user zuul.
Feb 17 17:15:46 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 17 17:15:46 compute-0 sshd-session[78246]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:15:47 compute-0 python3.9[78399]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:15:49 compute-0 sudo[78553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npkildgxwdtpcqvvbjojrjmrscaalfon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348548.6955214-45-53844237606429/AnsiballZ_file.py'
Feb 17 17:15:49 compute-0 sudo[78553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:49 compute-0 python3.9[78556]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:49 compute-0 sudo[78553]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:49 compute-0 sudo[78706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buyqvepmtsobpimalwrtdyfnnxddftal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348549.3967657-45-44950676668043/AnsiballZ_file.py'
Feb 17 17:15:49 compute-0 sudo[78706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:49 compute-0 python3.9[78709]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:49 compute-0 sudo[78706]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:50 compute-0 sudo[78859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyarcmgklibvazqmhyubdatqcrviynbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348549.9847784-60-185896062436587/AnsiballZ_stat.py'
Feb 17 17:15:50 compute-0 sudo[78859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:50 compute-0 python3.9[78862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:50 compute-0 sudo[78859]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:51 compute-0 sudo[78983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrlzlfptylglhnufpfcotfhuikekjtzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348549.9847784-60-185896062436587/AnsiballZ_copy.py'
Feb 17 17:15:51 compute-0 sudo[78983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:51 compute-0 python3.9[78986]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348549.9847784-60-185896062436587/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=05cde288cb45827b4c194ebeeff2a06873e48cd1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:51 compute-0 sudo[78983]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:51 compute-0 sudo[79136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzryhpiqyfrhueqhnukugsucptyvyclp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348551.4470947-60-39338768895377/AnsiballZ_stat.py'
Feb 17 17:15:51 compute-0 sudo[79136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:51 compute-0 python3.9[79139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:51 compute-0 sudo[79136]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:52 compute-0 sudo[79260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqbmnzcqlhftdiiikhdbkpzthagkzhdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348551.4470947-60-39338768895377/AnsiballZ_copy.py'
Feb 17 17:15:52 compute-0 sudo[79260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:52 compute-0 python3.9[79263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348551.4470947-60-39338768895377/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=957c684be69041cef22f0396dafb10bfc3328165 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:52 compute-0 sudo[79260]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:52 compute-0 sudo[79413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjbrdtaheqygenmjblikzadyuksgxzrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348552.567124-60-158832399010040/AnsiballZ_stat.py'
Feb 17 17:15:52 compute-0 sudo[79413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:52 compute-0 python3.9[79416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:53 compute-0 sudo[79413]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:53 compute-0 sudo[79537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tghlnguecdnndiihfnmeoxlvorqafobm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348552.567124-60-158832399010040/AnsiballZ_copy.py'
Feb 17 17:15:53 compute-0 sudo[79537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:53 compute-0 python3.9[79540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348552.567124-60-158832399010040/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=dc188addaa5adce510e7cb753cfc3ab442a90af4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:53 compute-0 sudo[79537]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:53 compute-0 sudo[79690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhraaraadihjszzbbeaaurekljcudzms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348553.6793005-104-235324413598735/AnsiballZ_file.py'
Feb 17 17:15:53 compute-0 sudo[79690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:54 compute-0 python3.9[79693]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:54 compute-0 sudo[79690]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:54 compute-0 sudo[79843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbivyasexbbfjbdjvtsupjnahyelldlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348554.2152972-104-38998650444019/AnsiballZ_file.py'
Feb 17 17:15:54 compute-0 sudo[79843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:54 compute-0 python3.9[79846]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:54 compute-0 sudo[79843]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:55 compute-0 sudo[79996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yccjgpwaieifkbltwrohcwmdjqvzkyne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348554.8053367-119-194794745710579/AnsiballZ_stat.py'
Feb 17 17:15:55 compute-0 sudo[79996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:55 compute-0 python3.9[79999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:55 compute-0 sudo[79996]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:55 compute-0 sudo[80120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjoihyuebxqncwgnptisblhdxkhbffdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348554.8053367-119-194794745710579/AnsiballZ_copy.py'
Feb 17 17:15:55 compute-0 sudo[80120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:55 compute-0 python3.9[80123]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348554.8053367-119-194794745710579/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=f7fc3f83e3f6185f7a22dd94131764b3003c5078 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:55 compute-0 sudo[80120]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:56 compute-0 sudo[80273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jndnzfingrzawzphecodagypyusquzfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348555.8599958-119-22234869927606/AnsiballZ_stat.py'
Feb 17 17:15:56 compute-0 sudo[80273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:56 compute-0 python3.9[80276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:56 compute-0 sudo[80273]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:56 compute-0 sudo[80397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqfxepfvbmgqnlzacmsfaaieglljfcmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348555.8599958-119-22234869927606/AnsiballZ_copy.py'
Feb 17 17:15:56 compute-0 sudo[80397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:56 compute-0 python3.9[80400]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348555.8599958-119-22234869927606/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7d57eb66e7f37e2b0b5b752d95ddf428e5f77d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:56 compute-0 sudo[80397]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:57 compute-0 sudo[80550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvshmpodhqgpnbqhimoslnwbiuzzebxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348556.905972-119-21206640772375/AnsiballZ_stat.py'
Feb 17 17:15:57 compute-0 sudo[80550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:57 compute-0 python3.9[80553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:57 compute-0 sudo[80550]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:57 compute-0 sudo[80674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdblihmcvcerykqnsaqxombantmbowxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348556.905972-119-21206640772375/AnsiballZ_copy.py'
Feb 17 17:15:57 compute-0 sudo[80674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:57 compute-0 python3.9[80677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348556.905972-119-21206640772375/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=335b40ab4ae84e3a7fdb6e097c8929f83c10621c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:15:58 compute-0 sudo[80674]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:58 compute-0 sudo[80827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efjenwhatbfrdvviojumukwlldunehli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348558.1769757-163-97663390737885/AnsiballZ_file.py'
Feb 17 17:15:58 compute-0 sudo[80827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:58 compute-0 python3.9[80830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:58 compute-0 sudo[80827]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:59 compute-0 sudo[80980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuancgkzogbwmabstyvzfnxwlwsbrqav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348558.7873778-163-166613535257382/AnsiballZ_file.py'
Feb 17 17:15:59 compute-0 sudo[80980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:59 compute-0 python3.9[80983]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:15:59 compute-0 sudo[80980]: pam_unix(sudo:session): session closed for user root
Feb 17 17:15:59 compute-0 sudo[81133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgvvtmeleypufwhpjhsppjcwbzzjbcnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348559.4523795-178-227039454327334/AnsiballZ_stat.py'
Feb 17 17:15:59 compute-0 sudo[81133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:15:59 compute-0 python3.9[81136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:15:59 compute-0 sudo[81133]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:00 compute-0 sudo[81257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqyloqtspunryxlhvmgmrxwuljqnekwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348559.4523795-178-227039454327334/AnsiballZ_copy.py'
Feb 17 17:16:00 compute-0 sudo[81257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:00 compute-0 python3.9[81260]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348559.4523795-178-227039454327334/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8fc49445876035e38b016ec31545275fe63cb73c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:00 compute-0 sudo[81257]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:00 compute-0 sudo[81410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtelthctypceqkywirdtmrtuawbcntys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348560.5746427-178-108088669428357/AnsiballZ_stat.py'
Feb 17 17:16:00 compute-0 sudo[81410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:00 compute-0 python3.9[81413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:00 compute-0 sudo[81410]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:01 compute-0 sudo[81534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mytkshpbbjwcowhfkznmrmdtqnsilbnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348560.5746427-178-108088669428357/AnsiballZ_copy.py'
Feb 17 17:16:01 compute-0 sudo[81534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:01 compute-0 python3.9[81537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348560.5746427-178-108088669428357/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=bf723a6cf1aacc8d9cf02855da6851fb35d5160b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:01 compute-0 sudo[81534]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:03 compute-0 sudo[81687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdojmoasiwmamhyzlvkgwosbxrzhkuup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348563.0688696-178-72831673970600/AnsiballZ_stat.py'
Feb 17 17:16:03 compute-0 sudo[81687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:03 compute-0 python3.9[81690]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:03 compute-0 sudo[81687]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:03 compute-0 sudo[81811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agzsdsctucrshntuksenlqgqznlopeln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348563.0688696-178-72831673970600/AnsiballZ_copy.py'
Feb 17 17:16:03 compute-0 sudo[81811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:03 compute-0 python3.9[81814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348563.0688696-178-72831673970600/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=29d645d017c8925d6dca200f38b59da466a5774e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:03 compute-0 sudo[81811]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:04 compute-0 sudo[81964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrkwfmciwcyvruupwmvkxrtwtfabiszj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348564.2048986-222-166337127186224/AnsiballZ_file.py'
Feb 17 17:16:04 compute-0 sudo[81964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:04 compute-0 python3.9[81967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:04 compute-0 sudo[81964]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:04 compute-0 sudo[82117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbpovxvtdgtkgqkjjmuiqflrpsbueayn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348564.768955-222-194929092592635/AnsiballZ_file.py'
Feb 17 17:16:04 compute-0 sudo[82117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:05 compute-0 python3.9[82120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:05 compute-0 sudo[82117]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:05 compute-0 sudo[82270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftsatshrbxszaozdagnvpvendxfvohtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348565.3500345-237-171625932239955/AnsiballZ_stat.py'
Feb 17 17:16:05 compute-0 sudo[82270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:05 compute-0 python3.9[82273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:05 compute-0 sudo[82270]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:06 compute-0 sudo[82394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncbtjjbikdbxgknioqcueifsdlzcierx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348565.3500345-237-171625932239955/AnsiballZ_copy.py'
Feb 17 17:16:06 compute-0 sudo[82394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:06 compute-0 python3.9[82397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348565.3500345-237-171625932239955/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8a9ba9f5a3aedcd17123ecbaea73688811563435 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:06 compute-0 sudo[82394]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:06 compute-0 sudo[82549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxovyndcmvvvydbbqippaxfktwnhgkda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348566.3502297-237-232950216078676/AnsiballZ_stat.py'
Feb 17 17:16:06 compute-0 sudo[82549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:06 compute-0 python3.9[82552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:06 compute-0 sudo[82549]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:07 compute-0 sudo[82673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sblhowjoiggwutdhwkfldyluypuuqqoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348566.3502297-237-232950216078676/AnsiballZ_copy.py'
Feb 17 17:16:07 compute-0 sudo[82673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:07 compute-0 python3.9[82676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348566.3502297-237-232950216078676/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=bf723a6cf1aacc8d9cf02855da6851fb35d5160b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:07 compute-0 sudo[82673]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:07 compute-0 sshd-session[82398]: Connection closed by authenticating user root 209.38.233.161 port 57778 [preauth]
Feb 17 17:16:07 compute-0 sudo[82826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niirmqkjwjoucuryiljcrlpbfekvqziw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348567.4050107-237-1515515939881/AnsiballZ_stat.py'
Feb 17 17:16:07 compute-0 sudo[82826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:07 compute-0 python3.9[82829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:07 compute-0 sudo[82826]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:08 compute-0 sudo[82950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uilvhnzqmlbidwozciwekahhcliexnbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348567.4050107-237-1515515939881/AnsiballZ_copy.py'
Feb 17 17:16:08 compute-0 sudo[82950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:08 compute-0 python3.9[82953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348567.4050107-237-1515515939881/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f83d5f128d97930c45bb16a8507e0bf060d434de backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:08 compute-0 sudo[82950]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:09 compute-0 sudo[83103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vswyhbuvfpgjjcqcoeekvfivfsduqhib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348569.0442932-297-52122742074137/AnsiballZ_file.py'
Feb 17 17:16:09 compute-0 sudo[83103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:09 compute-0 python3.9[83106]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:09 compute-0 sudo[83103]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:09 compute-0 sudo[83256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbjxecaqjxvuvlofmcdaicedrdrpkbhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348569.7531786-305-219176018343456/AnsiballZ_stat.py'
Feb 17 17:16:09 compute-0 sudo[83256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:10 compute-0 python3.9[83259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:10 compute-0 sudo[83256]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:10 compute-0 sudo[83380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxedfdwehctfotfrnctyugiodhjqgtaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348569.7531786-305-219176018343456/AnsiballZ_copy.py'
Feb 17 17:16:10 compute-0 sudo[83380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:10 compute-0 python3.9[83383]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348569.7531786-305-219176018343456/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:10 compute-0 sudo[83380]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:11 compute-0 sudo[83533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mopkgawvuzrmwwccwvoinftkilvhgssn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348570.8477592-321-10489648423087/AnsiballZ_file.py'
Feb 17 17:16:11 compute-0 sudo[83533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:11 compute-0 python3.9[83536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:11 compute-0 sudo[83533]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:11 compute-0 sudo[83686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rympswegcjkpqedbbuwllppahgfedjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348571.4749384-329-110863963335691/AnsiballZ_stat.py'
Feb 17 17:16:11 compute-0 sudo[83686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:11 compute-0 python3.9[83689]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:11 compute-0 sudo[83686]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:12 compute-0 sudo[83810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkuohicktfjauojanhksxcvwqisxfbez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348571.4749384-329-110863963335691/AnsiballZ_copy.py'
Feb 17 17:16:12 compute-0 sudo[83810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:12 compute-0 python3.9[83813]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348571.4749384-329-110863963335691/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:12 compute-0 sudo[83810]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:13 compute-0 sudo[83963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbwmejuuqtcydumcmhxippftoohrwnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348572.7583647-345-1316947220311/AnsiballZ_file.py'
Feb 17 17:16:13 compute-0 sudo[83963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:13 compute-0 python3.9[83966]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:13 compute-0 sudo[83963]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:13 compute-0 sudo[84116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykhcfexagoiqtxtphighvgrcwvoygpgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348573.4131887-353-171710298378836/AnsiballZ_stat.py'
Feb 17 17:16:13 compute-0 sudo[84116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:13 compute-0 python3.9[84119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:13 compute-0 sudo[84116]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:14 compute-0 sudo[84240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanaefkkjmjlbmdwabwxoeprvqlsbxhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348573.4131887-353-171710298378836/AnsiballZ_copy.py'
Feb 17 17:16:14 compute-0 sudo[84240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:14 compute-0 python3.9[84243]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348573.4131887-353-171710298378836/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:14 compute-0 sudo[84240]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:14 compute-0 sudo[84393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrcvifjelwarrzcnciisrfpnnulrsaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348574.6342223-369-153705260087651/AnsiballZ_file.py'
Feb 17 17:16:14 compute-0 sudo[84393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:15 compute-0 python3.9[84396]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:15 compute-0 sudo[84393]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:15 compute-0 sudo[84546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njpdjqwpoyibkuuipawwutqelsysameg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348575.3138125-377-168772959861823/AnsiballZ_stat.py'
Feb 17 17:16:15 compute-0 sudo[84546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:15 compute-0 python3.9[84549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:15 compute-0 sudo[84546]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:16 compute-0 sudo[84670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypahhhciqmgjbhdrovxrcdkxjpdrffkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348575.3138125-377-168772959861823/AnsiballZ_copy.py'
Feb 17 17:16:16 compute-0 sudo[84670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:16 compute-0 python3.9[84673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348575.3138125-377-168772959861823/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:16 compute-0 sudo[84670]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:16 compute-0 sudo[84823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gruqdohikmwbvmguoxtffrsjdctjnrhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348576.5525002-393-142547139738322/AnsiballZ_file.py'
Feb 17 17:16:16 compute-0 sudo[84823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:17 compute-0 python3.9[84826]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:17 compute-0 sudo[84823]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:17 compute-0 sudo[84976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaykxwvseyazhmsbztshsfiuyplbaosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348577.1951497-401-77220604628008/AnsiballZ_stat.py'
Feb 17 17:16:17 compute-0 sudo[84976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:17 compute-0 python3.9[84979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:17 compute-0 sudo[84976]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:18 compute-0 sudo[85100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnsfeagfztikrwrmottkdhrbutjehcmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348577.1951497-401-77220604628008/AnsiballZ_copy.py'
Feb 17 17:16:18 compute-0 sudo[85100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:18 compute-0 python3.9[85103]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348577.1951497-401-77220604628008/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:18 compute-0 sudo[85100]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:18 compute-0 sudo[85253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amafqlsepvoudtqktkqywayuslvqjzzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348578.373366-417-112736382431343/AnsiballZ_file.py'
Feb 17 17:16:18 compute-0 sudo[85253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:18 compute-0 python3.9[85256]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:18 compute-0 sudo[85253]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:19 compute-0 sudo[85406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbgyknqhjsugwmiznlkvbqmcsccuhefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348578.9548764-425-245899728338579/AnsiballZ_stat.py'
Feb 17 17:16:19 compute-0 sudo[85406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:19 compute-0 python3.9[85409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:19 compute-0 sudo[85406]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:19 compute-0 sudo[85530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eovzbrzzmpcmuxrcrwtetutrlivvsudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348578.9548764-425-245899728338579/AnsiballZ_copy.py'
Feb 17 17:16:19 compute-0 sudo[85530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:19 compute-0 python3.9[85533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348578.9548764-425-245899728338579/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:19 compute-0 sudo[85530]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:20 compute-0 sudo[85683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unieqnpwjcxffkotfuldmjdjkwxmuiyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348580.0551834-441-80849554385008/AnsiballZ_file.py'
Feb 17 17:16:20 compute-0 sudo[85683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:20 compute-0 python3.9[85686]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:20 compute-0 sudo[85683]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:20 compute-0 sudo[85836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqoovohvcqlanqsknyfqbrfymiujynjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348580.621409-449-163571461621935/AnsiballZ_stat.py'
Feb 17 17:16:20 compute-0 sudo[85836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:21 compute-0 python3.9[85839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:21 compute-0 sudo[85836]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:21 compute-0 sudo[85960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idtlxgfzrtwtsobgqwswcpdasjvciasm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348580.621409-449-163571461621935/AnsiballZ_copy.py'
Feb 17 17:16:21 compute-0 sudo[85960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:21 compute-0 python3.9[85963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348580.621409-449-163571461621935/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=8de607a0b7a24a3cf424fe58664a4768629b5cf8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:21 compute-0 sudo[85960]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:22 compute-0 sshd-session[78249]: Connection closed by 192.168.122.30 port 37578
Feb 17 17:16:22 compute-0 sshd-session[78246]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:16:22 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 17 17:16:22 compute-0 systemd[1]: session-18.scope: Consumed 25.638s CPU time.
Feb 17 17:16:22 compute-0 systemd-logind[806]: Session 18 logged out. Waiting for processes to exit.
Feb 17 17:16:22 compute-0 systemd-logind[806]: Removed session 18.
Feb 17 17:16:26 compute-0 sshd-session[85988]: Accepted publickey for zuul from 192.168.122.30 port 46028 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:16:26 compute-0 systemd-logind[806]: New session 19 of user zuul.
Feb 17 17:16:26 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 17 17:16:26 compute-0 sshd-session[85988]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:16:27 compute-0 chronyd[65849]: Selected source 158.69.193.108 (pool.ntp.org)
Feb 17 17:16:27 compute-0 python3.9[86141]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:16:28 compute-0 sudo[86295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvoedbghyricmnkcchvclgsnmwdwgudt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348588.341628-29-42623850977948/AnsiballZ_file.py'
Feb 17 17:16:28 compute-0 sudo[86295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:28 compute-0 python3.9[86298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:28 compute-0 sudo[86295]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:29 compute-0 sudo[86448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fohumyscxyszrwtcdillggbylkorljxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348589.094269-29-87654135641368/AnsiballZ_file.py'
Feb 17 17:16:29 compute-0 sudo[86448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:29 compute-0 python3.9[86451]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:29 compute-0 sudo[86448]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:30 compute-0 python3.9[86601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:16:30 compute-0 sudo[86751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xahbdnjlxwbclrqvlrojalutvgxeotiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348590.334329-52-79738964667081/AnsiballZ_seboolean.py'
Feb 17 17:16:30 compute-0 sudo[86751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:30 compute-0 python3.9[86754]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 17 17:16:33 compute-0 sudo[86751]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:33 compute-0 sudo[86908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqlrghdigitaiipqdkaddhrmghwimmhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348593.6468997-62-254884386847726/AnsiballZ_setup.py'
Feb 17 17:16:33 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 17 17:16:33 compute-0 sudo[86908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:34 compute-0 python3.9[86911]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:16:34 compute-0 sudo[86908]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:34 compute-0 sudo[86993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usuptnmcbeihprusjggxpngbsogjpwiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348593.6468997-62-254884386847726/AnsiballZ_dnf.py'
Feb 17 17:16:34 compute-0 sudo[86993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:35 compute-0 python3.9[86996]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:16:36 compute-0 sudo[86993]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:37 compute-0 sudo[87147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wycgmszzvomfswlbwclkaweaggkzavwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348596.5661676-74-10809821163240/AnsiballZ_systemd.py'
Feb 17 17:16:37 compute-0 sudo[87147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:37 compute-0 python3.9[87150]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:16:37 compute-0 sudo[87147]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:38 compute-0 sudo[87303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-robscrodrcvexazpmwpwlgzqwihstxqh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348597.691009-82-115083017708724/AnsiballZ_edpm_nftables_snippet.py'
Feb 17 17:16:38 compute-0 sudo[87303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:38 compute-0 python3[87306]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 17 17:16:38 compute-0 sudo[87303]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:38 compute-0 sudo[87456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnkijwopvyfupitcdtgzbauxxikuykrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348598.5344815-91-127304982167889/AnsiballZ_file.py'
Feb 17 17:16:38 compute-0 sudo[87456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:38 compute-0 python3.9[87459]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:38 compute-0 sudo[87456]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:39 compute-0 sudo[87609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zemmqefydymycllecxyfevyiwqtttuyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348599.0772889-99-26548369894595/AnsiballZ_stat.py'
Feb 17 17:16:39 compute-0 sudo[87609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:39 compute-0 python3.9[87612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:39 compute-0 sudo[87609]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:39 compute-0 sudo[87688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvidtsevzarbyxyddvysnnfazejauiuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348599.0772889-99-26548369894595/AnsiballZ_file.py'
Feb 17 17:16:39 compute-0 sudo[87688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:40 compute-0 python3.9[87691]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:40 compute-0 sudo[87688]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:40 compute-0 sudo[87841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukrnhccjduofggziitfahigsmeukehfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348600.3117094-111-260489945932523/AnsiballZ_stat.py'
Feb 17 17:16:40 compute-0 sudo[87841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:40 compute-0 python3.9[87844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:40 compute-0 sudo[87841]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:40 compute-0 sudo[87920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbgmblnaejgqytvkioapzmxtgywqhcgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348600.3117094-111-260489945932523/AnsiballZ_file.py'
Feb 17 17:16:40 compute-0 sudo[87920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:41 compute-0 python3.9[87923]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4f08d3s1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:41 compute-0 sudo[87920]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:41 compute-0 sudo[88073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzwgozanienohbgcmpsdzlnpzbmmkqws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348601.284842-123-222908517298706/AnsiballZ_stat.py'
Feb 17 17:16:41 compute-0 sudo[88073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:41 compute-0 python3.9[88076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:41 compute-0 sudo[88073]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:41 compute-0 sudo[88152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovfahdvbhiozcuamikfvfhyglcqqwdiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348601.284842-123-222908517298706/AnsiballZ_file.py'
Feb 17 17:16:41 compute-0 sudo[88152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:42 compute-0 python3.9[88155]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:42 compute-0 sudo[88152]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:42 compute-0 sudo[88305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbfphkwowyywgjdarbgyqgkarcbpitfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348602.2680147-136-209476206303732/AnsiballZ_command.py'
Feb 17 17:16:42 compute-0 sudo[88305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:42 compute-0 python3.9[88308]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:42 compute-0 sudo[88305]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:43 compute-0 sudo[88459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyxctukleohansesnntnuiipamshecpn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348602.9929373-144-179914160645129/AnsiballZ_edpm_nftables_from_files.py'
Feb 17 17:16:43 compute-0 sudo[88459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:43 compute-0 python3[88462]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 17 17:16:43 compute-0 sudo[88459]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:43 compute-0 sudo[88612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buwsbhpiugyhwzdautvhvppnaqzbqdae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348603.7229364-152-42309802479744/AnsiballZ_stat.py'
Feb 17 17:16:43 compute-0 sudo[88612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:44 compute-0 python3.9[88615]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:44 compute-0 sudo[88612]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:44 compute-0 sudo[88738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afpkjduujxleutbgkqqyjdkdzejbgqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348603.7229364-152-42309802479744/AnsiballZ_copy.py'
Feb 17 17:16:44 compute-0 sudo[88738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:44 compute-0 python3.9[88741]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348603.7229364-152-42309802479744/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:44 compute-0 sudo[88738]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:45 compute-0 sudo[88891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gadnzewqsllhaihipazgilbjkkueasdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348605.0333703-167-55906073692591/AnsiballZ_stat.py'
Feb 17 17:16:45 compute-0 sudo[88891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:45 compute-0 python3.9[88894]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:45 compute-0 sudo[88891]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:45 compute-0 sudo[89017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tttivryrsikuhfbulpofbtovuxgcpnjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348605.0333703-167-55906073692591/AnsiballZ_copy.py'
Feb 17 17:16:45 compute-0 sudo[89017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:46 compute-0 python3.9[89020]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348605.0333703-167-55906073692591/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:46 compute-0 sudo[89017]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:46 compute-0 sudo[89170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxihphwvasngqtkcoljzhqqjovqfvwxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348606.3212306-182-15533542087467/AnsiballZ_stat.py'
Feb 17 17:16:46 compute-0 sudo[89170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:46 compute-0 python3.9[89173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:46 compute-0 sudo[89170]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:47 compute-0 sudo[89296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syzmpxxouncowuayzngvxagqytgmdgdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348606.3212306-182-15533542087467/AnsiballZ_copy.py'
Feb 17 17:16:47 compute-0 sudo[89296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:47 compute-0 python3.9[89299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348606.3212306-182-15533542087467/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:47 compute-0 sudo[89296]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:47 compute-0 sudo[89449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwgwxhmfhuoivingbxdczbsbrnnonyjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348607.4359357-197-2635566071488/AnsiballZ_stat.py'
Feb 17 17:16:47 compute-0 sudo[89449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:47 compute-0 python3.9[89452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:47 compute-0 sudo[89449]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:48 compute-0 sudo[89575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnnrynnxoporeenurnorvxuezdstyxhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348607.4359357-197-2635566071488/AnsiballZ_copy.py'
Feb 17 17:16:48 compute-0 sudo[89575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:48 compute-0 python3.9[89578]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348607.4359357-197-2635566071488/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:48 compute-0 sudo[89575]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:48 compute-0 sudo[89728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykekdhtgvcmpogcwogwwoiamjivloctd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348608.5045712-212-46632247166438/AnsiballZ_stat.py'
Feb 17 17:16:48 compute-0 sudo[89728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:49 compute-0 python3.9[89731]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:49 compute-0 sudo[89728]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:49 compute-0 sudo[89854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrgrgrskvmamefrzyxzsvimqagghpdhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348608.5045712-212-46632247166438/AnsiballZ_copy.py'
Feb 17 17:16:49 compute-0 sudo[89854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:49 compute-0 python3.9[89857]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348608.5045712-212-46632247166438/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:49 compute-0 sudo[89854]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:49 compute-0 sudo[90007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsxsatuayybmyyxxwdewerhzhizkjgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348609.6743517-227-32761801568856/AnsiballZ_file.py'
Feb 17 17:16:49 compute-0 sudo[90007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:50 compute-0 python3.9[90010]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:50 compute-0 sudo[90007]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:50 compute-0 sudo[90160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvnwsaellzlotmurfdgkcujjhvyonbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348610.297695-235-31539277997078/AnsiballZ_command.py'
Feb 17 17:16:50 compute-0 sudo[90160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:50 compute-0 python3.9[90163]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:50 compute-0 sudo[90160]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:51 compute-0 sudo[90316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvwvcahrzayoebqftlztjrsvubpopxhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348610.9543343-243-122174729098933/AnsiballZ_blockinfile.py'
Feb 17 17:16:51 compute-0 sudo[90316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:51 compute-0 python3.9[90319]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:51 compute-0 sudo[90316]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:51 compute-0 sudo[90469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chiryyfrbuxofpknhacagtmqtyaefhgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348611.7230413-252-127211924035702/AnsiballZ_command.py'
Feb 17 17:16:51 compute-0 sudo[90469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:52 compute-0 python3.9[90472]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:52 compute-0 sudo[90469]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:52 compute-0 sudo[90623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jttkdcsvehdsoaybwwmsmyxtgxizaxwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348612.3375025-260-61119514892128/AnsiballZ_stat.py'
Feb 17 17:16:52 compute-0 sudo[90623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:52 compute-0 python3.9[90626]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:16:52 compute-0 sudo[90623]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:53 compute-0 sudo[90778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xksqplacuysdfymdpdeslqplnrgagfzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348612.957723-268-157685110930390/AnsiballZ_command.py'
Feb 17 17:16:53 compute-0 sudo[90778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:53 compute-0 python3.9[90781]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:53 compute-0 sudo[90778]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:53 compute-0 sudo[90934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avqigqcxypeigiwagjulkppksnwfnmik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348613.5780177-276-68647987599619/AnsiballZ_file.py'
Feb 17 17:16:53 compute-0 sudo[90934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:53 compute-0 python3.9[90937]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:16:53 compute-0 sudo[90934]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:54 compute-0 python3.9[91087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:16:55 compute-0 sudo[91238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afygqxalifncardoirrvzuplrxxbhwdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348615.5158064-317-90180274810211/AnsiballZ_command.py'
Feb 17 17:16:55 compute-0 sudo[91238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:55 compute-0 python3.9[91241]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:2f:db:26:37" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:55 compute-0 ovs-vsctl[91242]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:2f:db:26:37 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 17 17:16:55 compute-0 sudo[91238]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:56 compute-0 sudo[91392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhlfjcrxmzygwrdzwlflaffnsnbtfpra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348616.1647532-326-157644811748787/AnsiballZ_command.py'
Feb 17 17:16:56 compute-0 sudo[91392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:56 compute-0 python3.9[91395]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:56 compute-0 sudo[91392]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:57 compute-0 sudo[91548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbwiniwvxfsulzjfgchchpveslyjjxqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348616.8425558-334-142950177199088/AnsiballZ_command.py'
Feb 17 17:16:57 compute-0 sudo[91548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:57 compute-0 python3.9[91551]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:16:57 compute-0 ovs-vsctl[91552]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 17 17:16:57 compute-0 sudo[91548]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:57 compute-0 python3.9[91702]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:16:58 compute-0 sudo[91854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqfkdladygnmjhclzxcorogfatyedgzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348618.1085353-351-132790460095954/AnsiballZ_file.py'
Feb 17 17:16:58 compute-0 sudo[91854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:58 compute-0 python3.9[91857]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:58 compute-0 sudo[91854]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:59 compute-0 sudo[92007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owrskrbddbewwmwvgtiktencpmyeqnln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348618.763403-359-53835764107986/AnsiballZ_stat.py'
Feb 17 17:16:59 compute-0 sudo[92007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:59 compute-0 python3.9[92010]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:16:59 compute-0 sudo[92007]: pam_unix(sudo:session): session closed for user root
Feb 17 17:16:59 compute-0 sudo[92086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oddfjgbfydfgoorlvslukjczffmgcrsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348618.763403-359-53835764107986/AnsiballZ_file.py'
Feb 17 17:16:59 compute-0 sudo[92086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:16:59 compute-0 python3.9[92089]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:16:59 compute-0 sudo[92086]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:00 compute-0 sudo[92239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrftrwleeqixdxvstbjkhxjabuajocei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348619.8802886-359-228749082267863/AnsiballZ_stat.py'
Feb 17 17:17:00 compute-0 sudo[92239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:00 compute-0 python3.9[92242]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:00 compute-0 sudo[92239]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:00 compute-0 sudo[92318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvoffyqzwhqnlklqdoqmacjrxyoyqdyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348619.8802886-359-228749082267863/AnsiballZ_file.py'
Feb 17 17:17:00 compute-0 sudo[92318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:00 compute-0 python3.9[92321]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:00 compute-0 sudo[92318]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:01 compute-0 sudo[92471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lswmsarljhtmqcmgrepbuvrnofvhcfom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348620.8665757-382-41779262037998/AnsiballZ_file.py'
Feb 17 17:17:01 compute-0 sudo[92471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:01 compute-0 python3.9[92474]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:01 compute-0 sudo[92471]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:01 compute-0 sudo[92624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmagjwvlpyqujybrgynzfsfknekvoqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348621.4801862-390-59221964075344/AnsiballZ_stat.py'
Feb 17 17:17:01 compute-0 sudo[92624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:01 compute-0 python3.9[92627]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:01 compute-0 sudo[92624]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:02 compute-0 sudo[92703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhoqwlleottnyzdagvooxkjydoepqseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348621.4801862-390-59221964075344/AnsiballZ_file.py'
Feb 17 17:17:02 compute-0 sudo[92703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:02 compute-0 python3.9[92706]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:02 compute-0 sudo[92703]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:02 compute-0 sudo[92856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeoogxdppcatbvnndjgrcnhoothyxwbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348622.4942775-402-154279442459137/AnsiballZ_stat.py'
Feb 17 17:17:02 compute-0 sudo[92856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:02 compute-0 python3.9[92859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:02 compute-0 sudo[92856]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:03 compute-0 sudo[92935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cosbqiqzlbcpleaerjymbrmcissbirpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348622.4942775-402-154279442459137/AnsiballZ_file.py'
Feb 17 17:17:03 compute-0 sudo[92935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:03 compute-0 python3.9[92938]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:03 compute-0 sudo[92935]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:03 compute-0 sudo[93088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyuhlpikgdefmeojiezwguxvrjrjulux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348623.4509678-414-167784930832082/AnsiballZ_systemd.py'
Feb 17 17:17:03 compute-0 sudo[93088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:04 compute-0 python3.9[93091]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:17:04 compute-0 systemd[1]: Reloading.
Feb 17 17:17:04 compute-0 systemd-rc-local-generator[93113]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:04 compute-0 systemd-sysv-generator[93123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:04 compute-0 sudo[93088]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:04 compute-0 sudo[93285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwkwrpyyguxiyglphispjytfqhlsopki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348624.4289885-422-237366100955213/AnsiballZ_stat.py'
Feb 17 17:17:04 compute-0 sudo[93285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:04 compute-0 python3.9[93288]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:04 compute-0 sudo[93285]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:04 compute-0 sudo[93364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyyfhfcynjuezooaujsdfeiuukrdzuxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348624.4289885-422-237366100955213/AnsiballZ_file.py'
Feb 17 17:17:04 compute-0 sudo[93364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:05 compute-0 python3.9[93367]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:05 compute-0 sudo[93364]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:05 compute-0 sudo[93517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhnnhvwisjkvqcgenvrqqvlifvipzwpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348625.3302898-434-155230405171453/AnsiballZ_stat.py'
Feb 17 17:17:05 compute-0 sudo[93517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:05 compute-0 python3.9[93520]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:05 compute-0 sudo[93517]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:05 compute-0 sudo[93596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxmspmsdgivvwypmxewbbtzgneuesxrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348625.3302898-434-155230405171453/AnsiballZ_file.py'
Feb 17 17:17:05 compute-0 sudo[93596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:06 compute-0 python3.9[93599]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:06 compute-0 sudo[93596]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:06 compute-0 sudo[93749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaskdqmgmpkpkybzqcyiuostnccpnehj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348626.2933197-446-253782535221645/AnsiballZ_systemd.py'
Feb 17 17:17:06 compute-0 sudo[93749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:06 compute-0 python3.9[93752]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:17:06 compute-0 systemd[1]: Reloading.
Feb 17 17:17:06 compute-0 systemd-rc-local-generator[93777]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:06 compute-0 systemd-sysv-generator[93783]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:07 compute-0 systemd[1]: Starting Create netns directory...
Feb 17 17:17:07 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 17 17:17:07 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 17 17:17:07 compute-0 systemd[1]: Finished Create netns directory.
Feb 17 17:17:07 compute-0 sudo[93749]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:07 compute-0 sudo[93951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezqbjedauovjcwefbiqrywvsecjwzgcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348627.3063445-456-212366551145251/AnsiballZ_file.py'
Feb 17 17:17:07 compute-0 sudo[93951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:07 compute-0 python3.9[93954]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:07 compute-0 sudo[93951]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:08 compute-0 sudo[94104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfvisjrkprcbsgmvfdkytawneuolzdvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348627.8799636-464-116807016452496/AnsiballZ_stat.py'
Feb 17 17:17:08 compute-0 sudo[94104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:08 compute-0 python3.9[94107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:08 compute-0 sudo[94104]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:08 compute-0 sudo[94228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukpfqmiwibwcorhzuhpnmbyuhbenzuxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348627.8799636-464-116807016452496/AnsiballZ_copy.py'
Feb 17 17:17:08 compute-0 sudo[94228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:08 compute-0 python3.9[94231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348627.8799636-464-116807016452496/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:08 compute-0 sudo[94228]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:09 compute-0 sudo[94381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsbfthqeywtlxlkfxwvtbqmvwkleagxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348629.0738022-481-153351384283299/AnsiballZ_file.py'
Feb 17 17:17:09 compute-0 sudo[94381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:09 compute-0 python3.9[94384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:09 compute-0 sudo[94381]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:09 compute-0 sudo[94534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usfcinicpkycorqtprhfpmoxtfdshcpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348629.6395586-489-47288495193998/AnsiballZ_file.py'
Feb 17 17:17:09 compute-0 sudo[94534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:10 compute-0 python3.9[94537]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:10 compute-0 sudo[94534]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:10 compute-0 sudo[94687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoedjrafzuogoiyhjfyecuxpydhynsti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348630.240652-497-257575712410575/AnsiballZ_stat.py'
Feb 17 17:17:10 compute-0 sudo[94687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:10 compute-0 python3.9[94690]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:10 compute-0 sudo[94687]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:10 compute-0 sudo[94811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hecqaavmmidyrwbuxuchpqvvvqiplshn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348630.240652-497-257575712410575/AnsiballZ_copy.py'
Feb 17 17:17:10 compute-0 sudo[94811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:11 compute-0 python3.9[94814]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348630.240652-497-257575712410575/.source.json _original_basename=.wppoa29d follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:11 compute-0 sudo[94811]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:11 compute-0 python3.9[94964]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:13 compute-0 sudo[95385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgxmnhgcikvrlqwxtaakluolfwnsjbse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348633.2171383-537-64016516413726/AnsiballZ_container_config_data.py'
Feb 17 17:17:13 compute-0 sudo[95385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:13 compute-0 python3.9[95388]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 17 17:17:13 compute-0 sudo[95385]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:14 compute-0 sudo[95538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfkcqljzgvdwwmnwsgpxqzwsphizqmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348634.0488718-548-51071832908410/AnsiballZ_container_config_hash.py'
Feb 17 17:17:14 compute-0 sudo[95538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:14 compute-0 python3.9[95541]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:17:14 compute-0 sudo[95538]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:15 compute-0 sudo[95691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxbgqdvtfhxsfzmmqhiwvhpcndhtaubj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348635.123002-558-160144867029848/AnsiballZ_edpm_container_manage.py'
Feb 17 17:17:15 compute-0 sudo[95691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:15 compute-0 python3[95694]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:17:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:17:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:17:15 compute-0 podman[95731]: 2026-02-17 17:17:15.935880027 +0000 UTC m=+0.042892670 container create 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 17 17:17:15 compute-0 podman[95731]: 2026-02-17 17:17:15.911363393 +0000 UTC m=+0.018376046 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 17 17:17:15 compute-0 python3[95694]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 17 17:17:16 compute-0 sudo[95691]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:16 compute-0 sudo[95920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzwejlefkqvxbfhuexarvkfoevamxgly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348636.1678684-566-148321842488814/AnsiballZ_stat.py'
Feb 17 17:17:16 compute-0 sudo[95920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:16 compute-0 python3.9[95923]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:17:16 compute-0 sudo[95920]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 17 17:17:17 compute-0 sudo[96075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taywsgxkuvqwpptzjoafiulngdqhqlgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348636.8010986-575-107942681686952/AnsiballZ_file.py'
Feb 17 17:17:17 compute-0 sudo[96075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:17 compute-0 python3.9[96078]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:17 compute-0 sudo[96075]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:17 compute-0 sudo[96152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebkrjeovyzitthtagwhofleokfdncnud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348636.8010986-575-107942681686952/AnsiballZ_stat.py'
Feb 17 17:17:17 compute-0 sudo[96152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:17 compute-0 python3.9[96155]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:17:17 compute-0 sudo[96152]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:18 compute-0 sudo[96304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vebkltrpyfeaoiblyysdpcnzdlxdrerd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348637.622044-575-244264417724131/AnsiballZ_copy.py'
Feb 17 17:17:18 compute-0 sudo[96304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:18 compute-0 python3.9[96307]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771348637.622044-575-244264417724131/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:18 compute-0 sudo[96304]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:18 compute-0 sudo[96381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwkiwszybaolxdlhptvupezafiqbtacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348637.622044-575-244264417724131/AnsiballZ_systemd.py'
Feb 17 17:17:18 compute-0 sudo[96381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:18 compute-0 python3.9[96384]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:17:18 compute-0 systemd[1]: Reloading.
Feb 17 17:17:18 compute-0 systemd-rc-local-generator[96413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:18 compute-0 systemd-sysv-generator[96417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:18 compute-0 sudo[96381]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:19 compute-0 sudo[96502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-almthetblczkfrngyosujkvvbntwrtey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348637.622044-575-244264417724131/AnsiballZ_systemd.py'
Feb 17 17:17:19 compute-0 sudo[96502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:19 compute-0 sshd-session[96385]: Connection closed by authenticating user root 209.38.233.161 port 43132 [preauth]
Feb 17 17:17:19 compute-0 python3.9[96505]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:17:19 compute-0 systemd[1]: Reloading.
Feb 17 17:17:19 compute-0 systemd-rc-local-generator[96534]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:19 compute-0 systemd-sysv-generator[96537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:19 compute-0 systemd[1]: Starting ovn_controller container...
Feb 17 17:17:19 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 17 17:17:19 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:17:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23b0a16e6d9e800da4715eaf34817e7d5397fbde6582ce7d9255e2fe3028b995/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 17 17:17:19 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75.
Feb 17 17:17:19 compute-0 podman[96553]: 2026-02-17 17:17:19.87342514 +0000 UTC m=+0.118355286 container init 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 17 17:17:19 compute-0 ovn_controller[96568]: + sudo -E kolla_set_configs
Feb 17 17:17:19 compute-0 podman[96553]: 2026-02-17 17:17:19.901294845 +0000 UTC m=+0.146224911 container start 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 17 17:17:19 compute-0 edpm-start-podman-container[96553]: ovn_controller
Feb 17 17:17:19 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 17 17:17:19 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 17 17:17:19 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 17 17:17:19 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 17 17:17:19 compute-0 edpm-start-podman-container[96552]: Creating additional drop-in dependency for "ovn_controller" (96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75)
Feb 17 17:17:19 compute-0 systemd[96608]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 17 17:17:19 compute-0 podman[96574]: 2026-02-17 17:17:19.988667792 +0000 UTC m=+0.076030213 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 17 17:17:19 compute-0 systemd[1]: 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75-1556c0f06f36ea4f.service: Main process exited, code=exited, status=1/FAILURE
Feb 17 17:17:19 compute-0 systemd[1]: 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75-1556c0f06f36ea4f.service: Failed with result 'exit-code'.
Feb 17 17:17:19 compute-0 systemd[1]: Reloading.
Feb 17 17:17:20 compute-0 systemd-sysv-generator[96656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:20 compute-0 systemd-rc-local-generator[96651]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:20 compute-0 systemd[96608]: Queued start job for default target Main User Target.
Feb 17 17:17:20 compute-0 systemd[96608]: Created slice User Application Slice.
Feb 17 17:17:20 compute-0 systemd[96608]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 17 17:17:20 compute-0 systemd[96608]: Started Daily Cleanup of User's Temporary Directories.
Feb 17 17:17:20 compute-0 systemd[96608]: Reached target Paths.
Feb 17 17:17:20 compute-0 systemd[96608]: Reached target Timers.
Feb 17 17:17:20 compute-0 systemd[96608]: Starting D-Bus User Message Bus Socket...
Feb 17 17:17:20 compute-0 systemd[96608]: Starting Create User's Volatile Files and Directories...
Feb 17 17:17:20 compute-0 systemd[96608]: Finished Create User's Volatile Files and Directories.
Feb 17 17:17:20 compute-0 systemd[96608]: Listening on D-Bus User Message Bus Socket.
Feb 17 17:17:20 compute-0 systemd[96608]: Reached target Sockets.
Feb 17 17:17:20 compute-0 systemd[96608]: Reached target Basic System.
Feb 17 17:17:20 compute-0 systemd[96608]: Reached target Main User Target.
Feb 17 17:17:20 compute-0 systemd[96608]: Startup finished in 107ms.
Feb 17 17:17:20 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 17 17:17:20 compute-0 systemd[1]: Started ovn_controller container.
Feb 17 17:17:20 compute-0 systemd[1]: Started Session c1 of User root.
Feb 17 17:17:20 compute-0 sudo[96502]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:20 compute-0 ovn_controller[96568]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 17 17:17:20 compute-0 ovn_controller[96568]: INFO:__main__:Validating config file
Feb 17 17:17:20 compute-0 ovn_controller[96568]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 17 17:17:20 compute-0 ovn_controller[96568]: INFO:__main__:Writing out command to execute
Feb 17 17:17:20 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: ++ cat /run_command
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + ARGS=
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + sudo kolla_copy_cacerts
Feb 17 17:17:20 compute-0 systemd[1]: Started Session c2 of User root.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + [[ ! -n '' ]]
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + . kolla_extend_start
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 17 17:17:20 compute-0 ovn_controller[96568]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + umask 0022
Feb 17 17:17:20 compute-0 ovn_controller[96568]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 17 17:17:20 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3515] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3522] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <warn>  [1771348640.3524] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3536] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3548] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3555] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 17 17:17:20 compute-0 kernel: br-int: entered promiscuous mode
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00010|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00011|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00013|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00014|features|INFO|OVS Feature: ct_flush, state: supported
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00015|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00016|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00017|main|INFO|OVS feature set changed, force recompute.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00024|main|INFO|OVS feature set changed, force recompute.
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 17 17:17:20 compute-0 ovn_controller[96568]: 2026-02-17T17:17:20Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3700] manager: (ovn-11f143-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 17 17:17:20 compute-0 systemd-udevd[96720]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:17:20 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3872] device (genev_sys_6081): carrier: link connected
Feb 17 17:17:20 compute-0 NetworkManager[56323]: <info>  [1771348640.3875] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 17 17:17:20 compute-0 systemd-udevd[96737]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:17:20 compute-0 python3.9[96841]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:17:21 compute-0 sudo[96991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpvmjzgcczncrkjbytsolxrpkhkzvadv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348641.3144355-620-192370635624382/AnsiballZ_stat.py'
Feb 17 17:17:21 compute-0 sudo[96991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:21 compute-0 python3.9[96994]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:21 compute-0 sudo[96991]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:21 compute-0 sudo[97115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xosmnimbfoysohtzthtxgbwahjiivtei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348641.3144355-620-192370635624382/AnsiballZ_copy.py'
Feb 17 17:17:21 compute-0 sudo[97115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:22 compute-0 python3.9[97118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348641.3144355-620-192370635624382/.source.yaml _original_basename=.qvnz0rpx follow=False checksum=858b0cb9a23aa14c7c9a1ed87dd6172c24a0ff7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:22 compute-0 sudo[97115]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:22 compute-0 sudo[97268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmiheeoubwfsjdjsmfxqlxlpznubffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348642.4069054-635-220508388237505/AnsiballZ_command.py'
Feb 17 17:17:22 compute-0 sudo[97268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:22 compute-0 python3.9[97271]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:17:22 compute-0 ovs-vsctl[97272]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 17 17:17:22 compute-0 sudo[97268]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:23 compute-0 sudo[97422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfzsrmflwfqbyjjsvpkukiriwpxxzcje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348642.917941-643-196757364925150/AnsiballZ_command.py'
Feb 17 17:17:23 compute-0 sudo[97422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:23 compute-0 python3.9[97425]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:17:23 compute-0 ovs-vsctl[97427]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 17 17:17:23 compute-0 sudo[97422]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:23 compute-0 sudo[97578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rddegjnrgrshxmlidoqhfwubbbjyfijp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348643.670145-657-35351260039274/AnsiballZ_command.py'
Feb 17 17:17:23 compute-0 sudo[97578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:24 compute-0 python3.9[97581]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:17:24 compute-0 ovs-vsctl[97582]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 17 17:17:24 compute-0 sudo[97578]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:24 compute-0 sshd-session[85991]: Connection closed by 192.168.122.30 port 46028
Feb 17 17:17:24 compute-0 sshd-session[85988]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:17:24 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 17 17:17:24 compute-0 systemd-logind[806]: Session 19 logged out. Waiting for processes to exit.
Feb 17 17:17:24 compute-0 systemd[1]: session-19.scope: Consumed 40.807s CPU time.
Feb 17 17:17:24 compute-0 systemd-logind[806]: Removed session 19.
Feb 17 17:17:29 compute-0 sshd-session[97607]: Accepted publickey for zuul from 192.168.122.30 port 47254 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:17:29 compute-0 systemd-logind[806]: New session 21 of user zuul.
Feb 17 17:17:29 compute-0 systemd[1]: Started Session 21 of User zuul.
Feb 17 17:17:29 compute-0 sshd-session[97607]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:17:30 compute-0 python3.9[97760]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:17:30 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 17 17:17:30 compute-0 systemd[96608]: Activating special unit Exit the Session...
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped target Main User Target.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped target Basic System.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped target Paths.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped target Sockets.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped target Timers.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 17 17:17:30 compute-0 systemd[96608]: Closed D-Bus User Message Bus Socket.
Feb 17 17:17:30 compute-0 systemd[96608]: Stopped Create User's Volatile Files and Directories.
Feb 17 17:17:30 compute-0 systemd[96608]: Removed slice User Application Slice.
Feb 17 17:17:30 compute-0 systemd[96608]: Reached target Shutdown.
Feb 17 17:17:30 compute-0 systemd[96608]: Finished Exit the Session.
Feb 17 17:17:30 compute-0 systemd[96608]: Reached target Exit the Session.
Feb 17 17:17:30 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 17 17:17:30 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 17 17:17:30 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 17 17:17:30 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 17 17:17:30 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 17 17:17:30 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 17 17:17:30 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 17 17:17:30 compute-0 sudo[97916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zibrrahkgnirzkbhrygnqnkussnoswvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348650.395417-29-267713971610874/AnsiballZ_file.py'
Feb 17 17:17:30 compute-0 sudo[97916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:31 compute-0 python3.9[97919]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:31 compute-0 sudo[97916]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:31 compute-0 sudo[98069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhrtvstpohnjbycsjkdbmhbzxofrmpin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348651.1529813-29-235297483647502/AnsiballZ_file.py'
Feb 17 17:17:31 compute-0 sudo[98069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:31 compute-0 python3.9[98072]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:31 compute-0 sudo[98069]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:32 compute-0 sudo[98222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aadhdigmxmgnjxmdtbtpzvdovqckvmpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348651.7953243-29-269980180707714/AnsiballZ_file.py'
Feb 17 17:17:32 compute-0 sudo[98222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:32 compute-0 python3.9[98225]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:32 compute-0 sudo[98222]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:32 compute-0 sudo[98375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zauvrimlqmiytbnjfqmfouzreeghsuyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348652.3323903-29-176560154020634/AnsiballZ_file.py'
Feb 17 17:17:32 compute-0 sudo[98375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:32 compute-0 python3.9[98378]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:32 compute-0 sudo[98375]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:33 compute-0 sudo[98528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yylhqqvtayagebtddrfpbayrtedaujvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348652.901782-29-91980718765969/AnsiballZ_file.py'
Feb 17 17:17:33 compute-0 sudo[98528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:33 compute-0 python3.9[98531]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:33 compute-0 sudo[98528]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:33 compute-0 python3.9[98681]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:17:34 compute-0 sudo[98831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wthfywwijsoimuyxuvkbngjfarjxtssr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348654.153218-73-80499507126672/AnsiballZ_seboolean.py'
Feb 17 17:17:34 compute-0 sudo[98831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:34 compute-0 python3.9[98834]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 17 17:17:35 compute-0 sudo[98831]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:35 compute-0 python3.9[98984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:36 compute-0 python3.9[99105]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348655.4350355-81-174053576552928/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:37 compute-0 python3.9[99255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:37 compute-0 python3.9[99376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348657.0087337-96-105238946498722/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:38 compute-0 sudo[99526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzzeeqizgzhvplokzfznftaxqdavebih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348658.0902293-113-159127667651521/AnsiballZ_setup.py'
Feb 17 17:17:38 compute-0 sudo[99526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:38 compute-0 python3.9[99529]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:17:38 compute-0 sudo[99526]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:39 compute-0 sudo[99612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zffkkxvgxtguwfqgygxxmtaxywdailug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348658.0902293-113-159127667651521/AnsiballZ_dnf.py'
Feb 17 17:17:39 compute-0 sudo[99612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:39 compute-0 python3.9[99615]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:17:40 compute-0 sudo[99612]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:41 compute-0 sudo[99766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-injytjhkgmqiprherxpqyiqhcbaqhsyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348660.897442-125-233495718565973/AnsiballZ_systemd.py'
Feb 17 17:17:41 compute-0 sudo[99766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:41 compute-0 python3.9[99769]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:17:41 compute-0 sudo[99766]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:42 compute-0 python3.9[99922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:43 compute-0 python3.9[100043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348662.129123-133-48701436777124/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:43 compute-0 python3.9[100193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:44 compute-0 python3.9[100314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348663.1790674-133-175713315937415/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:45 compute-0 python3.9[100464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:45 compute-0 python3.9[100585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348664.7343585-177-151854126161407/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:46 compute-0 python3.9[100735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:46 compute-0 python3.9[100856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348665.7380934-177-100206723126297/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:47 compute-0 python3.9[101006]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:17:47 compute-0 sudo[101158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqapimdmkpsiddpzcjhsihdfpbxqtkye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348667.251066-215-3678871832342/AnsiballZ_file.py'
Feb 17 17:17:47 compute-0 sudo[101158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:47 compute-0 python3.9[101161]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:47 compute-0 sudo[101158]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:48 compute-0 sudo[101311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phnzfuzbyltoopbkdttckkcpiozqmahe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348667.8731933-223-14609138906600/AnsiballZ_stat.py'
Feb 17 17:17:48 compute-0 sudo[101311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:48 compute-0 python3.9[101314]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:48 compute-0 sudo[101311]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:48 compute-0 sudo[101390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igjvnnurkxuegmsmkxlmoemdugtrjdsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348667.8731933-223-14609138906600/AnsiballZ_file.py'
Feb 17 17:17:48 compute-0 sudo[101390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:48 compute-0 python3.9[101393]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:48 compute-0 sudo[101390]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:49 compute-0 sudo[101543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rofcadykmidphrcchnjfbohscyhgeqsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348668.8204634-223-240955990273211/AnsiballZ_stat.py'
Feb 17 17:17:49 compute-0 sudo[101543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:49 compute-0 python3.9[101546]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:49 compute-0 sudo[101543]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:49 compute-0 sudo[101622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfsrxicyyqwteiojpfqiruyfgasavdxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348668.8204634-223-240955990273211/AnsiballZ_file.py'
Feb 17 17:17:49 compute-0 sudo[101622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:49 compute-0 python3.9[101625]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:49 compute-0 sudo[101622]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:50 compute-0 sudo[101781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vahdrxdnybutxoszsawztvdtujscilje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348669.8232765-246-154584505686994/AnsiballZ_file.py'
Feb 17 17:17:50 compute-0 sudo[101781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:50 compute-0 ovn_controller[96568]: 2026-02-17T17:17:50Z|00025|memory|INFO|16000 kB peak resident set size after 29.8 seconds
Feb 17 17:17:50 compute-0 ovn_controller[96568]: 2026-02-17T17:17:50Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Feb 17 17:17:50 compute-0 podman[101749]: 2026-02-17 17:17:50.100678842 +0000 UTC m=+0.072350175 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:17:50 compute-0 python3.9[101789]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:50 compute-0 sudo[101781]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:50 compute-0 sudo[101954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uavlpvystxtgarakmtmcibyjbjuouojb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348670.3658075-254-30063321276822/AnsiballZ_stat.py'
Feb 17 17:17:50 compute-0 sudo[101954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:50 compute-0 python3.9[101957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:50 compute-0 sudo[101954]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:50 compute-0 sudo[102033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cexpbitpogsxjffkwslbsnfnywftrgqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348670.3658075-254-30063321276822/AnsiballZ_file.py'
Feb 17 17:17:50 compute-0 sudo[102033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:51 compute-0 python3.9[102036]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:51 compute-0 sudo[102033]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:51 compute-0 sudo[102186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjvpfpjqaenqzgavsgyarhwtaurgsgzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348671.4087756-266-273751412011962/AnsiballZ_stat.py'
Feb 17 17:17:51 compute-0 sudo[102186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:51 compute-0 python3.9[102189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:51 compute-0 sudo[102186]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:52 compute-0 sudo[102265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpprnipeeyhufzlsxmykbmjqiyscfnjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348671.4087756-266-273751412011962/AnsiballZ_file.py'
Feb 17 17:17:52 compute-0 sudo[102265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:52 compute-0 python3.9[102268]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:52 compute-0 sudo[102265]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:52 compute-0 sudo[102418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyreoaimxmuvqfdcwpwedrdcfymeldwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348672.4249148-278-64197347320713/AnsiballZ_systemd.py'
Feb 17 17:17:52 compute-0 sudo[102418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:52 compute-0 python3.9[102421]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:17:52 compute-0 systemd[1]: Reloading.
Feb 17 17:17:53 compute-0 systemd-sysv-generator[102449]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:53 compute-0 systemd-rc-local-generator[102443]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:53 compute-0 sudo[102418]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:53 compute-0 sudo[102614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvnwtwjiqpovkjvktdmwcsxpmlfzjijy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348673.3502815-286-281050882184148/AnsiballZ_stat.py'
Feb 17 17:17:53 compute-0 sudo[102614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:53 compute-0 python3.9[102617]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:53 compute-0 sudo[102614]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:54 compute-0 sudo[102693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdpgiuolmhpqzntpplctctuzjehnnxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348673.3502815-286-281050882184148/AnsiballZ_file.py'
Feb 17 17:17:54 compute-0 sudo[102693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:54 compute-0 python3.9[102696]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:54 compute-0 sudo[102693]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:54 compute-0 sudo[102846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwaeqppamezjpietdbpawtyygbickvnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348674.3238456-298-234222089977004/AnsiballZ_stat.py'
Feb 17 17:17:54 compute-0 sudo[102846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:54 compute-0 python3.9[102849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:54 compute-0 sudo[102846]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:54 compute-0 sudo[102925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrlwaxhbstazvifpjppgonvhvqnvstnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348674.3238456-298-234222089977004/AnsiballZ_file.py'
Feb 17 17:17:54 compute-0 sudo[102925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:55 compute-0 python3.9[102928]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:55 compute-0 sudo[102925]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:55 compute-0 sudo[103078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tshtkyqfndgkchkarxkglhywsbpkwvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348675.1992512-310-87166114556419/AnsiballZ_systemd.py'
Feb 17 17:17:55 compute-0 sudo[103078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:55 compute-0 python3.9[103081]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:17:55 compute-0 systemd[1]: Reloading.
Feb 17 17:17:55 compute-0 systemd-rc-local-generator[103106]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:17:55 compute-0 systemd-sysv-generator[103111]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:17:55 compute-0 systemd[1]: Starting Create netns directory...
Feb 17 17:17:55 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 17 17:17:55 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 17 17:17:55 compute-0 systemd[1]: Finished Create netns directory.
Feb 17 17:17:55 compute-0 sudo[103078]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:56 compute-0 sudo[103278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfitofgsskodgzosxhcmbksinnxrnnkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348676.2565331-320-38735655920530/AnsiballZ_file.py'
Feb 17 17:17:56 compute-0 sudo[103278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:56 compute-0 python3.9[103281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:56 compute-0 sudo[103278]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:57 compute-0 sudo[103431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyzgpljsvdcjjemmtfnismdpseisqmvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348676.8079722-328-239163168901020/AnsiballZ_stat.py'
Feb 17 17:17:57 compute-0 sudo[103431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:57 compute-0 python3.9[103434]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:57 compute-0 sudo[103431]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:57 compute-0 sudo[103555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzlosjgzprjqbpnvuuknimlorflfxhzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348676.8079722-328-239163168901020/AnsiballZ_copy.py'
Feb 17 17:17:57 compute-0 sudo[103555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:57 compute-0 python3.9[103558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348676.8079722-328-239163168901020/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:57 compute-0 sudo[103555]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:58 compute-0 sudo[103708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbiusptehjflikbgrqwvyhaietugtbzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348678.07115-345-148119882296994/AnsiballZ_file.py'
Feb 17 17:17:58 compute-0 sudo[103708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:58 compute-0 python3.9[103711]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:17:58 compute-0 sudo[103708]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:58 compute-0 sudo[103861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktomgutevbuhthzbsvvysniuknyzqhye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348678.653044-353-237955153570725/AnsiballZ_file.py'
Feb 17 17:17:58 compute-0 sudo[103861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:59 compute-0 python3.9[103864]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:17:59 compute-0 sudo[103861]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:59 compute-0 sudo[104014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpbqdfnsojikkddnjldlblptfllpjnws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348679.2337558-361-121090847450802/AnsiballZ_stat.py'
Feb 17 17:17:59 compute-0 sudo[104014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:17:59 compute-0 python3.9[104017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:17:59 compute-0 sudo[104014]: pam_unix(sudo:session): session closed for user root
Feb 17 17:17:59 compute-0 sudo[104138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwgqattnyhaagjkpmluujxqzremdbiem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348679.2337558-361-121090847450802/AnsiballZ_copy.py'
Feb 17 17:17:59 compute-0 sudo[104138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:00 compute-0 python3.9[104141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348679.2337558-361-121090847450802/.source.json _original_basename=.iw70dnoi follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:00 compute-0 sudo[104138]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:00 compute-0 python3.9[104291]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:02 compute-0 sudo[104712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbsmecesmxfqubdxiwkgtrxkhjrlhufb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348682.1372323-401-140246167518437/AnsiballZ_container_config_data.py'
Feb 17 17:18:02 compute-0 sudo[104712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:02 compute-0 python3.9[104715]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 17 17:18:02 compute-0 sudo[104712]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:03 compute-0 sudo[104865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqvrcwerulxjeuxovbxuzvppevpimpsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348682.977824-412-16715970219933/AnsiballZ_container_config_hash.py'
Feb 17 17:18:03 compute-0 sudo[104865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:03 compute-0 python3.9[104868]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:18:03 compute-0 sudo[104865]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:04 compute-0 sudo[105018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enucojwygcdbswzaxxznwifpipkecjfx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348684.1500053-422-162392543332960/AnsiballZ_edpm_container_manage.py'
Feb 17 17:18:04 compute-0 sudo[105018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:04 compute-0 python3[105021]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:18:04 compute-0 podman[105057]: 2026-02-17 17:18:04.97120377 +0000 UTC m=+0.062101211 container create 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 17 17:18:04 compute-0 podman[105057]: 2026-02-17 17:18:04.934579597 +0000 UTC m=+0.025477058 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:18:04 compute-0 python3[105021]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:18:05 compute-0 sudo[105018]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:05 compute-0 sudo[105245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgbhnxhdliksuboovxzynxugyxwgtjzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348685.2165165-430-168886393696017/AnsiballZ_stat.py'
Feb 17 17:18:05 compute-0 sudo[105245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:05 compute-0 python3.9[105248]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:18:05 compute-0 sudo[105245]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:06 compute-0 sudo[105400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhfhyvmekzvllvvywdwlcnkpxyuxvbar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348685.9563038-439-239821282488141/AnsiballZ_file.py'
Feb 17 17:18:06 compute-0 sudo[105400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:06 compute-0 python3.9[105403]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:06 compute-0 sudo[105400]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:06 compute-0 sudo[105477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryihjutqdeeashttsmrlfxdgfhfdsgbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348685.9563038-439-239821282488141/AnsiballZ_stat.py'
Feb 17 17:18:06 compute-0 sudo[105477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:06 compute-0 python3.9[105480]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:18:06 compute-0 sudo[105477]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:07 compute-0 sudo[105629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-futjsnkdgrngbaocbitkgmkfmgsrxuuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348686.874272-439-36293187589698/AnsiballZ_copy.py'
Feb 17 17:18:07 compute-0 sudo[105629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:07 compute-0 python3.9[105632]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771348686.874272-439-36293187589698/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:07 compute-0 sudo[105629]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:07 compute-0 sudo[105706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmivjblbjytxfcvtsuhzxehdijvvtode ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348686.874272-439-36293187589698/AnsiballZ_systemd.py'
Feb 17 17:18:07 compute-0 sudo[105706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:08 compute-0 python3.9[105709]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:18:08 compute-0 systemd[1]: Reloading.
Feb 17 17:18:08 compute-0 systemd-sysv-generator[105739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:18:08 compute-0 systemd-rc-local-generator[105734]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:18:08 compute-0 sudo[105706]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:08 compute-0 sudo[105825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmkiwpcwulouwawyterbjnaxrsxyrwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348686.874272-439-36293187589698/AnsiballZ_systemd.py'
Feb 17 17:18:08 compute-0 sudo[105825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:08 compute-0 python3.9[105828]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:08 compute-0 systemd[1]: Reloading.
Feb 17 17:18:08 compute-0 systemd-rc-local-generator[105856]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:18:08 compute-0 systemd-sysv-generator[105861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:18:09 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 17 17:18:09 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:18:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b841ea5093058271a1bbe381402b146ad69c4072b6d89fb8074621c8c63a0b15/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 17 17:18:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b841ea5093058271a1bbe381402b146ad69c4072b6d89fb8074621c8c63a0b15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:18:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998.
Feb 17 17:18:09 compute-0 podman[105877]: 2026-02-17 17:18:09.172355531 +0000 UTC m=+0.110381713 container init 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + sudo -E kolla_set_configs
Feb 17 17:18:09 compute-0 podman[105877]: 2026-02-17 17:18:09.195286262 +0000 UTC m=+0.133312454 container start 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 17 17:18:09 compute-0 edpm-start-podman-container[105877]: ovn_metadata_agent
Feb 17 17:18:09 compute-0 edpm-start-podman-container[105876]: Creating additional drop-in dependency for "ovn_metadata_agent" (2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998)
Feb 17 17:18:09 compute-0 podman[105899]: 2026-02-17 17:18:09.250103594 +0000 UTC m=+0.044311652 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Validating config file
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Copying service configuration files
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Writing out command to execute
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 17 17:18:09 compute-0 systemd[1]: Reloading.
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: ++ cat /run_command
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + CMD=neutron-ovn-metadata-agent
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + ARGS=
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + sudo kolla_copy_cacerts
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + [[ ! -n '' ]]
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + . kolla_extend_start
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: Running command: 'neutron-ovn-metadata-agent'
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + umask 0022
Feb 17 17:18:09 compute-0 ovn_metadata_agent[105893]: + exec neutron-ovn-metadata-agent
Feb 17 17:18:09 compute-0 systemd-rc-local-generator[105961]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:18:09 compute-0 systemd-sysv-generator[105967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:18:09 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 17 17:18:09 compute-0 sudo[105825]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:10 compute-0 python3.9[106139]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:18:10 compute-0 sudo[106289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ascqcfxlwouczixiwgvhuitvgcemxaqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348690.574975-484-278901223474478/AnsiballZ_stat.py'
Feb 17 17:18:10 compute-0 sudo[106289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.890 105898 INFO neutron.common.config [-] Logging enabled!
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.890 105898 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.890 105898 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.891 105898 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.892 105898 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.893 105898 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.894 105898 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.895 105898 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.896 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.897 105898 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.898 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.899 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.900 105898 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.901 105898 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.902 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.903 105898 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.904 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.905 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.906 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.907 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.908 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.909 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.910 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.911 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.912 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.913 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.914 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.915 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.916 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.917 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.918 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.919 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.920 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.921 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.922 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.923 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.924 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.925 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.926 105898 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.927 105898 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.937 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.937 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.937 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.937 105898 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.937 105898 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.949 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ee0cee2f-3200-4f1f-8903-57b18789347d (UUID: ee0cee2f-3200-4f1f-8903-57b18789347d) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.970 105898 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.971 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.971 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.971 105898 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.973 105898 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.978 105898 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.984 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ee0cee2f-3200-4f1f-8903-57b18789347d'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], external_ids={}, name=ee0cee2f-3200-4f1f-8903-57b18789347d, nb_cfg_timestamp=1771348648372, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.984 105898 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f18a6614130>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.985 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.985 105898 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.985 105898 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.985 105898 INFO oslo_service.service [-] Starting 1 workers
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.988 105898 DEBUG oslo_service.service [-] Started child 106293 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.990 105898 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp19b4y1uq/privsep.sock']
Feb 17 17:18:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:10.991 106293 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-956731'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 17 17:18:10 compute-0 python3.9[106292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.020 106293 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.021 106293 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.021 106293 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.025 106293 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 17 17:18:11 compute-0 sudo[106289]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.032 106293 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.038 106293 INFO eventlet.wsgi.server [-] (106293) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 17 17:18:11 compute-0 sudo[106420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-outmwsuuwdyobppwqzrgofflsyiqmdug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348690.574975-484-278901223474478/AnsiballZ_copy.py'
Feb 17 17:18:11 compute-0 sudo[106420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:11 compute-0 python3.9[106423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348690.574975-484-278901223474478/.source.yaml _original_basename=.kh195co5 follow=False checksum=49d91f6c9cc7112a214d20fa26a5272919dd69ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:11 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 17 17:18:11 compute-0 sudo[106420]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.632 105898 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.632 105898 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp19b4y1uq/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.495 106424 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.499 106424 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.501 106424 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.501 106424 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106424
Feb 17 17:18:11 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:11.635 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[37448d2d-ee45-4c57-8c19-179da5c27d43]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:18:11 compute-0 sshd-session[97610]: Connection closed by 192.168.122.30 port 47254
Feb 17 17:18:11 compute-0 sshd-session[97607]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:18:11 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Feb 17 17:18:11 compute-0 systemd[1]: session-21.scope: Consumed 30.040s CPU time.
Feb 17 17:18:11 compute-0 systemd-logind[806]: Session 21 logged out. Waiting for processes to exit.
Feb 17 17:18:11 compute-0 systemd-logind[806]: Removed session 21.
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.106 106424 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.107 106424 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.107 106424 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.588 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3cb359-d5ea-43aa-b1d5-693705fb428c]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.590 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, column=external_ids, values=({'neutron:ovn-metadata-id': 'fba4e450-c57a-5981-8a7a-9761aa1bd1a4'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.721 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.733 105898 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.734 105898 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.735 105898 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.736 105898 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.736 105898 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.736 105898 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.736 105898 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.736 105898 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.737 105898 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.738 105898 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.739 105898 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.740 105898 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.741 105898 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.742 105898 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.743 105898 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.744 105898 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.745 105898 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.746 105898 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.747 105898 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.748 105898 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.749 105898 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.750 105898 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.751 105898 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.752 105898 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.753 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.754 105898 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.755 105898 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.756 105898 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.757 105898 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.758 105898 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.759 105898 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.760 105898 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.761 105898 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.762 105898 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.763 105898 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.764 105898 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.765 105898 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.766 105898 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.767 105898 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.768 105898 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.769 105898 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.770 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.771 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.772 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.773 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:18:12 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:18:12.774 105898 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:18:16 compute-0 sshd-session[106454]: Accepted publickey for zuul from 192.168.122.30 port 36840 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:18:16 compute-0 systemd-logind[806]: New session 22 of user zuul.
Feb 17 17:18:16 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 17 17:18:16 compute-0 sshd-session[106454]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:18:17 compute-0 python3.9[106607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:18:18 compute-0 sudo[106761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odsinfemmnmwffcwanvzqbwgrbshdyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348697.905952-29-109458090475587/AnsiballZ_command.py'
Feb 17 17:18:18 compute-0 sudo[106761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:18 compute-0 python3.9[106764]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:18 compute-0 sudo[106761]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:19 compute-0 sudo[106927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwwabypqwkkosllzvjlwiukrbqpooezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348698.9018528-40-79486697744922/AnsiballZ_systemd_service.py'
Feb 17 17:18:19 compute-0 sudo[106927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:19 compute-0 python3.9[106930]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:18:19 compute-0 systemd[1]: Reloading.
Feb 17 17:18:19 compute-0 systemd-rc-local-generator[106951]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:18:19 compute-0 systemd-sysv-generator[106956]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:18:19 compute-0 sudo[106927]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:20 compute-0 podman[107096]: 2026-02-17 17:18:20.580696468 +0000 UTC m=+0.119200780 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:18:20 compute-0 python3.9[107133]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:18:20 compute-0 network[107167]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:18:20 compute-0 network[107168]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:18:20 compute-0 network[107169]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:18:23 compute-0 sudo[107429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqkumxkgdsgnufrvjflgaeijkdcqywhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348702.871092-59-136203005053612/AnsiballZ_systemd_service.py'
Feb 17 17:18:23 compute-0 sudo[107429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:23 compute-0 python3.9[107432]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:23 compute-0 sudo[107429]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:23 compute-0 sudo[107583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmrekkhykruijjnsxthqrosqcjxbegvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348703.5471358-59-27678382759598/AnsiballZ_systemd_service.py'
Feb 17 17:18:23 compute-0 sudo[107583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:24 compute-0 python3.9[107586]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:24 compute-0 sudo[107583]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:24 compute-0 sudo[107737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqhxoyehywcrpfjdqobvrzinfyckfaoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348704.1979537-59-88774346373237/AnsiballZ_systemd_service.py'
Feb 17 17:18:24 compute-0 sudo[107737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:24 compute-0 python3.9[107740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:24 compute-0 sudo[107737]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:25 compute-0 sudo[107891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwhxtrgzfmqgnycyntyqfwwloswsbtrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348704.9764304-59-43137056449074/AnsiballZ_systemd_service.py'
Feb 17 17:18:25 compute-0 sudo[107891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:25 compute-0 python3.9[107894]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:25 compute-0 sudo[107891]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:25 compute-0 sudo[108045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwegsgzbvyfbcmgvpzynduklxotbgxpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348705.6395924-59-16407739202741/AnsiballZ_systemd_service.py'
Feb 17 17:18:25 compute-0 sudo[108045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:26 compute-0 python3.9[108048]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:26 compute-0 sudo[108045]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:26 compute-0 sudo[108199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lddpkteclqtvcjjdeoqvwdsbrqarsriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348706.2692208-59-92824844188628/AnsiballZ_systemd_service.py'
Feb 17 17:18:26 compute-0 sudo[108199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:26 compute-0 python3.9[108202]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:26 compute-0 sudo[108199]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:27 compute-0 sudo[108353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxfbgwjrzqhgzjdroimrjantyvnpwaty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348707.129878-59-82382836427844/AnsiballZ_systemd_service.py'
Feb 17 17:18:27 compute-0 sudo[108353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:27 compute-0 python3.9[108356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:18:27 compute-0 sudo[108353]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:28 compute-0 sudo[108507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtljaekcfdwryzfcvkydigmistsoksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348707.9380717-111-97525126669631/AnsiballZ_file.py'
Feb 17 17:18:28 compute-0 sudo[108507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:28 compute-0 python3.9[108510]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:28 compute-0 sudo[108507]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:28 compute-0 sudo[108660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcndztvrjwllgumjiqzvoncbfzqxewaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348708.6362042-111-213115503210512/AnsiballZ_file.py'
Feb 17 17:18:28 compute-0 sudo[108660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:29 compute-0 python3.9[108663]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:29 compute-0 sudo[108660]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:29 compute-0 sudo[108813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufqyjqfnujayfwxlhhwcfsgufrwgfcmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348709.1953075-111-150432817047914/AnsiballZ_file.py'
Feb 17 17:18:29 compute-0 sudo[108813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:29 compute-0 python3.9[108816]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:29 compute-0 sudo[108813]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:30 compute-0 sudo[108966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyjjfjdutqoajgegziriosdctewwjvof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348709.885265-111-146423690757097/AnsiballZ_file.py'
Feb 17 17:18:30 compute-0 sudo[108966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:30 compute-0 python3.9[108969]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:30 compute-0 sudo[108966]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:30 compute-0 sudo[109119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkqsgqgwshctawetalxptktlccscemyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348710.3793957-111-7935348197946/AnsiballZ_file.py'
Feb 17 17:18:30 compute-0 sudo[109119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:30 compute-0 python3.9[109122]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:30 compute-0 sudo[109119]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:31 compute-0 sudo[109272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snhthojmsvbgenzcoiwwgwkmgvjbhuxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348710.9038363-111-61821777376508/AnsiballZ_file.py'
Feb 17 17:18:31 compute-0 sudo[109272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:31 compute-0 python3.9[109275]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:31 compute-0 sudo[109272]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:31 compute-0 sudo[109427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqxqidyiwlymccvzrhcutbqyfzqhdhkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348711.4041538-111-81723809858023/AnsiballZ_file.py'
Feb 17 17:18:31 compute-0 sudo[109427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:31 compute-0 python3.9[109430]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:31 compute-0 sudo[109427]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:32 compute-0 sshd-session[109276]: Connection closed by authenticating user root 209.38.233.161 port 54364 [preauth]
Feb 17 17:18:32 compute-0 sudo[109580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqeipmdonbbeqimclhrklaadszvjpurr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348711.9895327-161-66538349160299/AnsiballZ_file.py'
Feb 17 17:18:32 compute-0 sudo[109580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:32 compute-0 python3.9[109583]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:32 compute-0 sudo[109580]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:32 compute-0 sudo[109733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fktirvfakkmivycdadagrtkazciweukk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348712.558408-161-3183626122547/AnsiballZ_file.py'
Feb 17 17:18:32 compute-0 sudo[109733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:32 compute-0 python3.9[109736]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:32 compute-0 sudo[109733]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:33 compute-0 sudo[109886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkaednfjxzcazdebxhupoflyryrjvtiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348713.0504284-161-115543088513828/AnsiballZ_file.py'
Feb 17 17:18:33 compute-0 sudo[109886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:33 compute-0 python3.9[109889]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:33 compute-0 sudo[109886]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:33 compute-0 sudo[110039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjxboxptiyxmjjgocjxicttbrdyqekoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348713.532743-161-116539008193222/AnsiballZ_file.py'
Feb 17 17:18:33 compute-0 sudo[110039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:33 compute-0 python3.9[110042]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:33 compute-0 sudo[110039]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:34 compute-0 sudo[110192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tidmafzcnqhkrbaukpgnloijtdqvicxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348714.0423572-161-3333014583225/AnsiballZ_file.py'
Feb 17 17:18:34 compute-0 sudo[110192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:34 compute-0 python3.9[110195]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:34 compute-0 sudo[110192]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:34 compute-0 sudo[110345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndnpfjbozkkxjyxunfgeyxdxsvplvmxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348714.5575998-161-106229739938670/AnsiballZ_file.py'
Feb 17 17:18:34 compute-0 sudo[110345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:34 compute-0 python3.9[110348]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:34 compute-0 sudo[110345]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:35 compute-0 sudo[110498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkquooftobuijeypgrdegioqciwujdll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348715.073588-161-68824052304437/AnsiballZ_file.py'
Feb 17 17:18:35 compute-0 sudo[110498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:35 compute-0 python3.9[110501]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:18:35 compute-0 sudo[110498]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:35 compute-0 sudo[110651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvxwogchhtsmiahqioggdmfsygsxjgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348715.6719491-212-148023004963508/AnsiballZ_command.py'
Feb 17 17:18:35 compute-0 sudo[110651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:36 compute-0 python3.9[110654]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:36 compute-0 sudo[110651]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:36 compute-0 python3.9[110806]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:18:37 compute-0 sudo[110956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqnnapqlzvivhtvetwwxtxbvmcudcpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348716.980041-230-185842560529415/AnsiballZ_systemd_service.py'
Feb 17 17:18:37 compute-0 sudo[110956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:37 compute-0 python3.9[110959]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:18:37 compute-0 systemd[1]: Reloading.
Feb 17 17:18:37 compute-0 systemd-rc-local-generator[110980]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:18:37 compute-0 systemd-sysv-generator[110988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:18:37 compute-0 sudo[110956]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:38 compute-0 sudo[111152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psfmijmiaxwygygoaqikeptfuajosqah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348717.882043-238-239936532029039/AnsiballZ_command.py'
Feb 17 17:18:38 compute-0 sudo[111152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:38 compute-0 python3.9[111155]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:38 compute-0 sudo[111152]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:38 compute-0 sudo[111306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwuhikhudbgypxdykqnozrwvgllcsyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348718.4604049-238-73428819227179/AnsiballZ_command.py'
Feb 17 17:18:38 compute-0 sudo[111306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:38 compute-0 python3.9[111309]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:38 compute-0 sudo[111306]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:39 compute-0 sudo[111460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luqhxqtbmhosnwjxnimcaempydtdaets ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348718.9751709-238-215910218592728/AnsiballZ_command.py'
Feb 17 17:18:39 compute-0 sudo[111460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:39 compute-0 python3.9[111463]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:39 compute-0 sudo[111460]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:39 compute-0 podman[111465]: 2026-02-17 17:18:39.47384646 +0000 UTC m=+0.061154269 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 17 17:18:39 compute-0 sudo[111634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qggglafspvhsbqhxvxydaaqinhcajspe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348719.516965-238-108667995588477/AnsiballZ_command.py'
Feb 17 17:18:39 compute-0 sudo[111634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:39 compute-0 python3.9[111637]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:39 compute-0 sudo[111634]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:40 compute-0 sudo[111788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylczxauglhhuqnjjctdnxkrsplnpffkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348720.0385306-238-54459932864320/AnsiballZ_command.py'
Feb 17 17:18:40 compute-0 sudo[111788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:40 compute-0 python3.9[111791]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:40 compute-0 sudo[111788]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:40 compute-0 sudo[111942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chwffjmtdzwihwakwxmdewyvzyllavss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348720.604988-238-175283274816729/AnsiballZ_command.py'
Feb 17 17:18:40 compute-0 sudo[111942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:41 compute-0 python3.9[111945]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:41 compute-0 sudo[111942]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:41 compute-0 sudo[112096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxtjwmwhrdmjebrcyzohjxmdzkzgmqmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348721.1547432-238-68220271649106/AnsiballZ_command.py'
Feb 17 17:18:41 compute-0 sudo[112096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:41 compute-0 python3.9[112099]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:18:41 compute-0 sudo[112096]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:42 compute-0 sudo[112250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpdkeiouctochhvsyyeeflqatlwfnlxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348721.9194293-292-49743741669391/AnsiballZ_getent.py'
Feb 17 17:18:42 compute-0 sudo[112250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:42 compute-0 python3.9[112253]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 17 17:18:42 compute-0 sudo[112250]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:43 compute-0 sudo[112404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvuyuhreaqpwateezdkfspxixhgqscnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348722.630541-300-218038642898180/AnsiballZ_group.py'
Feb 17 17:18:43 compute-0 sudo[112404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:43 compute-0 python3.9[112407]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:18:43 compute-0 groupadd[112408]: group added to /etc/group: name=libvirt, GID=42473
Feb 17 17:18:43 compute-0 groupadd[112408]: group added to /etc/gshadow: name=libvirt
Feb 17 17:18:43 compute-0 groupadd[112408]: new group: name=libvirt, GID=42473
Feb 17 17:18:43 compute-0 sudo[112404]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:44 compute-0 sudo[112563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byuaeaxsbpbghwxxltlmcnumlquwrhnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348724.4275815-308-161230978519897/AnsiballZ_user.py'
Feb 17 17:18:44 compute-0 sudo[112563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:45 compute-0 python3.9[112566]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 17 17:18:45 compute-0 useradd[112568]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 17 17:18:45 compute-0 sudo[112563]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:45 compute-0 sudo[112724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqtqoonvazastfyorpzpzddfmwvfabyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348725.4052887-319-224335961710763/AnsiballZ_setup.py'
Feb 17 17:18:45 compute-0 sudo[112724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:45 compute-0 python3.9[112727]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:18:46 compute-0 sudo[112724]: pam_unix(sudo:session): session closed for user root
Feb 17 17:18:46 compute-0 sudo[112809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hamjtecdwzlpjqgwyceiuykhzcgswfhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348725.4052887-319-224335961710763/AnsiballZ_dnf.py'
Feb 17 17:18:46 compute-0 sudo[112809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:18:46 compute-0 python3.9[112812]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:18:50 compute-0 podman[112824]: 2026-02-17 17:18:50.782481483 +0000 UTC m=+0.120642967 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 17 17:19:08 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:19:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:19:09 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 17 17:19:09 compute-0 podman[113035]: 2026-02-17 17:19:09.706799718 +0000 UTC m=+0.046489639 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:19:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:19:10.932 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:19:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:19:10.935 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:19:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:19:10.935 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:19:17 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:19:17 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:19:21 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 17 17:19:21 compute-0 podman[113063]: 2026-02-17 17:19:21.740671939 +0000 UTC m=+0.069197946 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, 
container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 17 17:19:40 compute-0 podman[124497]: 2026-02-17 17:19:40.714701767 +0000 UTC m=+0.060515532 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 17 17:19:46 compute-0 sshd-session[129963]: Connection closed by authenticating user root 209.38.233.161 port 34420 [preauth]
Feb 17 17:19:52 compute-0 podman[130008]: 2026-02-17 17:19:52.747648179 +0000 UTC m=+0.086185234 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:19:59 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 17 17:19:59 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 17 17:20:00 compute-0 groupadd[130046]: group added to /etc/group: name=dnsmasq, GID=993
Feb 17 17:20:00 compute-0 groupadd[130046]: group added to /etc/gshadow: name=dnsmasq
Feb 17 17:20:00 compute-0 groupadd[130046]: new group: name=dnsmasq, GID=993
Feb 17 17:20:00 compute-0 useradd[130053]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 17 17:20:00 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:20:00 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 17 17:20:00 compute-0 dbus-broker-launch[781]: Noticed file-system modification, trigger reload.
Feb 17 17:20:01 compute-0 groupadd[130066]: group added to /etc/group: name=clevis, GID=992
Feb 17 17:20:01 compute-0 groupadd[130066]: group added to /etc/gshadow: name=clevis
Feb 17 17:20:01 compute-0 groupadd[130066]: new group: name=clevis, GID=992
Feb 17 17:20:01 compute-0 useradd[130073]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 17 17:20:01 compute-0 usermod[130083]: add 'clevis' to group 'tss'
Feb 17 17:20:01 compute-0 usermod[130083]: add 'clevis' to shadow group 'tss'
Feb 17 17:20:03 compute-0 polkitd[44382]: Reloading rules
Feb 17 17:20:03 compute-0 polkitd[44382]: Collecting garbage unconditionally...
Feb 17 17:20:03 compute-0 polkitd[44382]: Loading rules from directory /etc/polkit-1/rules.d
Feb 17 17:20:03 compute-0 polkitd[44382]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 17 17:20:03 compute-0 polkitd[44382]: Finished loading, compiling and executing 3 rules
Feb 17 17:20:03 compute-0 polkitd[44382]: Reloading rules
Feb 17 17:20:03 compute-0 polkitd[44382]: Collecting garbage unconditionally...
Feb 17 17:20:03 compute-0 polkitd[44382]: Loading rules from directory /etc/polkit-1/rules.d
Feb 17 17:20:03 compute-0 polkitd[44382]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 17 17:20:03 compute-0 polkitd[44382]: Finished loading, compiling and executing 3 rules
Feb 17 17:20:04 compute-0 groupadd[130273]: group added to /etc/group: name=ceph, GID=167
Feb 17 17:20:04 compute-0 groupadd[130273]: group added to /etc/gshadow: name=ceph
Feb 17 17:20:04 compute-0 groupadd[130273]: new group: name=ceph, GID=167
Feb 17 17:20:04 compute-0 useradd[130279]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 17 17:20:07 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 17 17:20:07 compute-0 sshd[1016]: Received signal 15; terminating.
Feb 17 17:20:07 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 17 17:20:07 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 17 17:20:07 compute-0 systemd[1]: sshd.service: Consumed 1.578s CPU time, read 32.0K from disk, written 0B to disk.
Feb 17 17:20:07 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 17 17:20:07 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 17 17:20:07 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 17:20:07 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 17:20:07 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 17 17:20:07 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 17 17:20:07 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 17 17:20:07 compute-0 sshd[130798]: Server listening on 0.0.0.0 port 22.
Feb 17 17:20:07 compute-0 sshd[130798]: Server listening on :: port 22.
Feb 17 17:20:07 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 17 17:20:08 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:20:08 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:20:08 compute-0 systemd[1]: Reloading.
Feb 17 17:20:08 compute-0 systemd-rc-local-generator[131050]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:08 compute-0 systemd-sysv-generator[131056]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:20:10 compute-0 sudo[112809]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:10 compute-0 podman[134800]: 2026-02-17 17:20:10.840813175 +0000 UTC m=+0.057975733 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 17 17:20:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:20:10.931 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:20:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:20:10.932 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:20:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:20:10.932 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:20:11 compute-0 sudo[136015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsjuyizdhbsiitpioxpqomtsqxedzmwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348810.8853273-331-17556337338197/AnsiballZ_systemd.py'
Feb 17 17:20:11 compute-0 sudo[136015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:11 compute-0 python3.9[136049]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:20:11 compute-0 systemd[1]: Reloading.
Feb 17 17:20:11 compute-0 systemd-rc-local-generator[136683]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:11 compute-0 systemd-sysv-generator[136690]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:12 compute-0 sudo[136015]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:12 compute-0 sudo[137560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyyhqoblyslbcznjrbufrksmbjymjmdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348812.1251295-331-128021213806172/AnsiballZ_systemd.py'
Feb 17 17:20:12 compute-0 sudo[137560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:12 compute-0 python3.9[137590]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:20:12 compute-0 systemd[1]: Reloading.
Feb 17 17:20:12 compute-0 systemd-sysv-generator[138297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:12 compute-0 systemd-rc-local-generator[138292]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:12 compute-0 sudo[137560]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:13 compute-0 sudo[139214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hetlxujjhlfcuimyaydmiyiuyprsdfgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348813.0086782-331-108118157520176/AnsiballZ_systemd.py'
Feb 17 17:20:13 compute-0 sudo[139214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:13 compute-0 python3.9[139229]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:20:13 compute-0 systemd[1]: Reloading.
Feb 17 17:20:13 compute-0 systemd-rc-local-generator[139810]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:13 compute-0 systemd-sysv-generator[139814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:14 compute-0 sudo[139214]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:14 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:20:14 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:20:14 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.771s CPU time.
Feb 17 17:20:14 compute-0 systemd[1]: run-r497f626d24b4497d8d7cad156a713c80.service: Deactivated successfully.
Feb 17 17:20:14 compute-0 sudo[140237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxtsywpiepeaewguvisgvncuhxyqwbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348814.1961064-331-24784794579934/AnsiballZ_systemd.py'
Feb 17 17:20:14 compute-0 sudo[140237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:14 compute-0 python3.9[140240]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:20:14 compute-0 systemd[1]: Reloading.
Feb 17 17:20:14 compute-0 systemd-rc-local-generator[140271]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:14 compute-0 systemd-sysv-generator[140274]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:14 compute-0 sudo[140237]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:15 compute-0 sudo[140435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdrfxkluigdwdrerprxoeqckwgjxdsyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348815.1090622-360-29346640437633/AnsiballZ_systemd.py'
Feb 17 17:20:15 compute-0 sudo[140435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:15 compute-0 python3.9[140438]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:15 compute-0 systemd[1]: Reloading.
Feb 17 17:20:15 compute-0 systemd-sysv-generator[140473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:15 compute-0 systemd-rc-local-generator[140468]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:15 compute-0 sudo[140435]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:16 compute-0 sudo[140633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tokihororxlgguahqataijvlzybazbed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348816.006895-360-38016844589428/AnsiballZ_systemd.py'
Feb 17 17:20:16 compute-0 sudo[140633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:16 compute-0 python3.9[140636]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:16 compute-0 systemd[1]: Reloading.
Feb 17 17:20:16 compute-0 systemd-rc-local-generator[140666]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:16 compute-0 systemd-sysv-generator[140671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:16 compute-0 sudo[140633]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:17 compute-0 sudo[140833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydbuualkqcdoykflitccpthddezoynlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348817.0586216-360-209345427610139/AnsiballZ_systemd.py'
Feb 17 17:20:17 compute-0 sudo[140833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:17 compute-0 python3.9[140836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:17 compute-0 systemd[1]: Reloading.
Feb 17 17:20:17 compute-0 systemd-rc-local-generator[140864]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:17 compute-0 systemd-sysv-generator[140867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:17 compute-0 sudo[140833]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:18 compute-0 sudo[141031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbelzrziqarjuapaazkvnakrfutnrbwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348818.0712948-360-189915001112632/AnsiballZ_systemd.py'
Feb 17 17:20:18 compute-0 sudo[141031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:18 compute-0 python3.9[141034]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:18 compute-0 sudo[141031]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:19 compute-0 sudo[141187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isynjshouvspmbewiywnkthkgyxigdkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348818.7937376-360-239025869802000/AnsiballZ_systemd.py'
Feb 17 17:20:19 compute-0 sudo[141187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:19 compute-0 python3.9[141190]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:19 compute-0 systemd[1]: Reloading.
Feb 17 17:20:19 compute-0 systemd-rc-local-generator[141220]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:19 compute-0 systemd-sysv-generator[141225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:19 compute-0 sudo[141187]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:20 compute-0 sudo[141385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfylstejbqpcncuniwgizgxuzwofphau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348819.8279657-396-87845657115232/AnsiballZ_systemd.py'
Feb 17 17:20:20 compute-0 sudo[141385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:20 compute-0 python3.9[141388]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 17 17:20:20 compute-0 systemd[1]: Reloading.
Feb 17 17:20:20 compute-0 systemd-rc-local-generator[141413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:20:20 compute-0 systemd-sysv-generator[141416]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:20:20 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 17 17:20:20 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 17 17:20:20 compute-0 sudo[141385]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:21 compute-0 sudo[141585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqzygfuhlvvrvtonxqtgjmcpcjjasgsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348820.7802293-404-210456707545164/AnsiballZ_systemd.py'
Feb 17 17:20:21 compute-0 sudo[141585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:21 compute-0 python3.9[141588]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:21 compute-0 sudo[141585]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:21 compute-0 sudo[141741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycrpxikpryzjicmogabzfzqtuubowjzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348821.4911866-404-88229243573569/AnsiballZ_systemd.py'
Feb 17 17:20:21 compute-0 sudo[141741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:22 compute-0 python3.9[141744]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:22 compute-0 sudo[141741]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:22 compute-0 sudo[141897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trnnhhshmvxhvhttipkqxgvvunyrznie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348822.204658-404-138365575951513/AnsiballZ_systemd.py'
Feb 17 17:20:22 compute-0 sudo[141897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:22 compute-0 python3.9[141900]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:22 compute-0 sudo[141897]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:22 compute-0 podman[141904]: 2026-02-17 17:20:22.895541866 +0000 UTC m=+0.082878857 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:20:23 compute-0 sudo[142079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gebdellwonybymgydbbcmonwbyfgmfiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348822.8656888-404-143934670973018/AnsiballZ_systemd.py'
Feb 17 17:20:23 compute-0 sudo[142079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:23 compute-0 python3.9[142082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:23 compute-0 sudo[142079]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:23 compute-0 sudo[142235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xapedfxcycfmpcdhxlzumxnrkacmobne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348823.576207-404-277230417150140/AnsiballZ_systemd.py'
Feb 17 17:20:23 compute-0 sudo[142235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:24 compute-0 python3.9[142238]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:24 compute-0 sudo[142235]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:24 compute-0 sudo[142391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xypibfoquiicjcsljepjfoxeepuzdaqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348824.2747009-404-281311428044919/AnsiballZ_systemd.py'
Feb 17 17:20:24 compute-0 sudo[142391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:24 compute-0 python3.9[142394]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:24 compute-0 sudo[142391]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:25 compute-0 sudo[142547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpjxywiakpxaphnmnaxbsbfwefkyrzpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348824.9622538-404-181075416861274/AnsiballZ_systemd.py'
Feb 17 17:20:25 compute-0 sudo[142547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:25 compute-0 python3.9[142550]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:25 compute-0 sudo[142547]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:25 compute-0 sudo[142703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orsretzacrgmzydrijmcxssaevqlondb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348825.6688356-404-5099225384427/AnsiballZ_systemd.py'
Feb 17 17:20:25 compute-0 sudo[142703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:26 compute-0 python3.9[142706]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:26 compute-0 sudo[142703]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:26 compute-0 sudo[142859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iinyelvfnmgjvhfbbgvdprwewizfapon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348826.3540974-404-254425636124120/AnsiballZ_systemd.py'
Feb 17 17:20:26 compute-0 sudo[142859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:26 compute-0 python3.9[142862]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:26 compute-0 sudo[142859]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:27 compute-0 sudo[143015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzrkvlooysjayfvmtgnotlsiqpglwijc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348827.0472019-404-96432722446712/AnsiballZ_systemd.py'
Feb 17 17:20:27 compute-0 sudo[143015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:27 compute-0 python3.9[143018]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:27 compute-0 sudo[143015]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:28 compute-0 sudo[143171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nesfastcnshqqaldetzpsamkmlaakjeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348827.8182979-404-103540141897093/AnsiballZ_systemd.py'
Feb 17 17:20:28 compute-0 sudo[143171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:28 compute-0 python3.9[143174]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:28 compute-0 sudo[143171]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:28 compute-0 sudo[143327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgiclfjzcqdgzdgqwoqrhdtgpoohjdqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348828.5319605-404-61425537723064/AnsiballZ_systemd.py'
Feb 17 17:20:28 compute-0 sudo[143327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:29 compute-0 python3.9[143330]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:29 compute-0 sudo[143327]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:29 compute-0 sudo[143483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnamjzmhiyblrdgafdnulgkjyldmrvxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348829.2041981-404-252463828841487/AnsiballZ_systemd.py'
Feb 17 17:20:29 compute-0 sudo[143483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:29 compute-0 python3.9[143486]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:29 compute-0 sudo[143483]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:30 compute-0 sudo[143639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boduqtcuqbfnxpnqcefssnxrdbsmzkcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348829.9495504-404-187776069174973/AnsiballZ_systemd.py'
Feb 17 17:20:30 compute-0 sudo[143639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:30 compute-0 python3.9[143642]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 17 17:20:30 compute-0 sudo[143639]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:31 compute-0 sudo[143795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wevayaqnlgwclyttiokdbtiuhylhpseu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348830.8711848-506-183680217048419/AnsiballZ_file.py'
Feb 17 17:20:31 compute-0 sudo[143795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:31 compute-0 python3.9[143798]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:31 compute-0 sudo[143795]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:31 compute-0 sudo[143948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkaypbyfhyxcjonvthefrcyapbyeinau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348831.4634194-506-249172220632981/AnsiballZ_file.py'
Feb 17 17:20:31 compute-0 sudo[143948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:31 compute-0 python3.9[143951]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:31 compute-0 sudo[143948]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:32 compute-0 sudo[144101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwmcjyhwlbymvtfuvmqvucjmndbpjwue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348831.968099-506-70505454765502/AnsiballZ_file.py'
Feb 17 17:20:32 compute-0 sudo[144101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:32 compute-0 python3.9[144104]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:32 compute-0 sudo[144101]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:32 compute-0 sudo[144254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unycwrhmdfochmdgqoqzwfpegihzizev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348832.557742-506-152863625235626/AnsiballZ_file.py'
Feb 17 17:20:32 compute-0 sudo[144254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:32 compute-0 python3.9[144257]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:33 compute-0 sudo[144254]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:33 compute-0 sudo[144407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ellkfiivopoftwldoffeqxziddrjxmba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348833.1326377-506-176092007878167/AnsiballZ_file.py'
Feb 17 17:20:33 compute-0 sudo[144407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:33 compute-0 python3.9[144410]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:33 compute-0 sudo[144407]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:33 compute-0 sudo[144560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypbctxhmbqggcxpucacrhpyltulareyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348833.678-506-271772793192240/AnsiballZ_file.py'
Feb 17 17:20:33 compute-0 sudo[144560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:34 compute-0 python3.9[144563]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:20:34 compute-0 sudo[144560]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:34 compute-0 python3.9[144713]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:20:35 compute-0 sudo[144863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wezlffrxeygjzujfagfrrvtnwmtxyppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348834.917252-557-64597690067784/AnsiballZ_stat.py'
Feb 17 17:20:35 compute-0 sudo[144863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:35 compute-0 python3.9[144866]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:35 compute-0 sudo[144863]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:36 compute-0 sudo[144989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcmiskxqgneycwvjnualrhuimdvoidge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348834.917252-557-64597690067784/AnsiballZ_copy.py'
Feb 17 17:20:36 compute-0 sudo[144989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:36 compute-0 python3.9[144992]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348834.917252-557-64597690067784/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:36 compute-0 sudo[144989]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:36 compute-0 sudo[145142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkxfbubgvadoezhfouldidrsgtzxatji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348836.391251-557-81680680352799/AnsiballZ_stat.py'
Feb 17 17:20:36 compute-0 sudo[145142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:36 compute-0 python3.9[145145]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:36 compute-0 sudo[145142]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:37 compute-0 sudo[145268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtdvpijqksjvqmjchgdgcavvgimjvszb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348836.391251-557-81680680352799/AnsiballZ_copy.py'
Feb 17 17:20:37 compute-0 sudo[145268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:37 compute-0 python3.9[145271]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348836.391251-557-81680680352799/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:37 compute-0 sudo[145268]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:37 compute-0 sudo[145421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laavnxugtozeyblnxzkdwafortedjemo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348837.369673-557-56628159402348/AnsiballZ_stat.py'
Feb 17 17:20:37 compute-0 sudo[145421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:37 compute-0 python3.9[145424]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:37 compute-0 sudo[145421]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:38 compute-0 sudo[145547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tigkrrbwoptnpaecdbdplhemicqfyggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348837.369673-557-56628159402348/AnsiballZ_copy.py'
Feb 17 17:20:38 compute-0 sudo[145547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:38 compute-0 python3.9[145550]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348837.369673-557-56628159402348/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:38 compute-0 sudo[145547]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:38 compute-0 sudo[145700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fipajoqshixdhnhqicfshnzybuhetkah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348838.3238914-557-124611338820528/AnsiballZ_stat.py'
Feb 17 17:20:38 compute-0 sudo[145700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:38 compute-0 python3.9[145703]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:38 compute-0 sudo[145700]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:39 compute-0 sudo[145826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wancdfgeqslcfgdmriggibiynkjmxlvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348838.3238914-557-124611338820528/AnsiballZ_copy.py'
Feb 17 17:20:39 compute-0 sudo[145826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:39 compute-0 python3.9[145829]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348838.3238914-557-124611338820528/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:39 compute-0 sudo[145826]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:39 compute-0 sudo[145979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzebnpgtqfdblwaooctxzpkxfmxkpxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348839.3221405-557-154993494897981/AnsiballZ_stat.py'
Feb 17 17:20:39 compute-0 sudo[145979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:39 compute-0 python3.9[145982]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:39 compute-0 sudo[145979]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:40 compute-0 sudo[146105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwfktjwgldgyqinjswkcjqkxebmnjvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348839.3221405-557-154993494897981/AnsiballZ_copy.py'
Feb 17 17:20:40 compute-0 sudo[146105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:40 compute-0 python3.9[146108]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348839.3221405-557-154993494897981/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:40 compute-0 sudo[146105]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:40 compute-0 sudo[146258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqfnxkrhueiyavyreojmoezpvkfakkag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348840.3647463-557-117355082422018/AnsiballZ_stat.py'
Feb 17 17:20:40 compute-0 sudo[146258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:40 compute-0 python3.9[146261]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:40 compute-0 sudo[146258]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:41 compute-0 sudo[146395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwyvtxxsmflakewtxyyvmfrooihodswy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348840.3647463-557-117355082422018/AnsiballZ_copy.py'
Feb 17 17:20:41 compute-0 sudo[146395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:41 compute-0 podman[146358]: 2026-02-17 17:20:41.13198435 +0000 UTC m=+0.061372065 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:20:41 compute-0 python3.9[146400]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348840.3647463-557-117355082422018/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:41 compute-0 sudo[146395]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:41 compute-0 sudo[146556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spkypxwdikuwquzwnljglglqyqhbwflz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348841.4194221-557-89187288474816/AnsiballZ_stat.py'
Feb 17 17:20:41 compute-0 sudo[146556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:41 compute-0 python3.9[146559]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:41 compute-0 sudo[146556]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:42 compute-0 sudo[146680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-derzgariedrlpliofqdrzcnnvppzwfzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348841.4194221-557-89187288474816/AnsiballZ_copy.py'
Feb 17 17:20:42 compute-0 sudo[146680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:42 compute-0 python3.9[146683]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348841.4194221-557-89187288474816/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:42 compute-0 sudo[146680]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:42 compute-0 sudo[146833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibyszefkyxoywxnhsovpgfdnrzdpzzya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348842.4022443-557-82447994708974/AnsiballZ_stat.py'
Feb 17 17:20:42 compute-0 sudo[146833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:42 compute-0 python3.9[146836]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:42 compute-0 sudo[146833]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:43 compute-0 sudo[146959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjvfqchhcsigaozwrrtjnqfexedhbkwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348842.4022443-557-82447994708974/AnsiballZ_copy.py'
Feb 17 17:20:43 compute-0 sudo[146959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:43 compute-0 python3.9[146962]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771348842.4022443-557-82447994708974/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:43 compute-0 sudo[146959]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:43 compute-0 sudo[147112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvblqbewikvnufjqbhrtvkfuvujlqlia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348843.4141216-670-230447645746220/AnsiballZ_command.py'
Feb 17 17:20:43 compute-0 sudo[147112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:43 compute-0 python3.9[147115]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 17 17:20:43 compute-0 sudo[147112]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:44 compute-0 sudo[147266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-traytckglfvgvfkxizcrajtcarrorwjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348844.0096004-679-191645771938971/AnsiballZ_file.py'
Feb 17 17:20:44 compute-0 sudo[147266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:44 compute-0 python3.9[147269]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:44 compute-0 sudo[147266]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:44 compute-0 sudo[147419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrxobjodugjuszunuqfazsqbvmhfueau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348844.5941088-679-216147630743882/AnsiballZ_file.py'
Feb 17 17:20:44 compute-0 sudo[147419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:45 compute-0 python3.9[147422]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:45 compute-0 sudo[147419]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:45 compute-0 sudo[147572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejqllncfsgtgwjdlrtpdsffrjiqavhdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348845.1395497-679-244955971788466/AnsiballZ_file.py'
Feb 17 17:20:45 compute-0 sudo[147572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:45 compute-0 python3.9[147575]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:45 compute-0 sudo[147572]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:45 compute-0 sudo[147725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byalvaqvkknohixfvslutacfemfvpwry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348845.6359003-679-53096429507844/AnsiballZ_file.py'
Feb 17 17:20:45 compute-0 sudo[147725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:46 compute-0 python3.9[147728]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:46 compute-0 sudo[147725]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:46 compute-0 sudo[147878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwhmywgksrsimibtyrnmypdypxfmnhbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348846.1874893-679-100296919522238/AnsiballZ_file.py'
Feb 17 17:20:46 compute-0 sudo[147878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:46 compute-0 python3.9[147881]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:46 compute-0 sudo[147878]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:46 compute-0 sudo[148031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqbjmlmeckkjazpnkfnvoiwncywxcnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348846.7184074-679-51505992643923/AnsiballZ_file.py'
Feb 17 17:20:46 compute-0 sudo[148031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:47 compute-0 python3.9[148034]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:47 compute-0 sudo[148031]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:47 compute-0 sudo[148184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpenvjccnitqscqjpyaubbzrryoxochl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348847.2369971-679-72970593837541/AnsiballZ_file.py'
Feb 17 17:20:47 compute-0 sudo[148184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:47 compute-0 python3.9[148187]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:47 compute-0 sudo[148184]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:47 compute-0 sudo[148337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apbfrctfzfemrrahrjncaunezhhowbzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348847.7352004-679-83129939181073/AnsiballZ_file.py'
Feb 17 17:20:47 compute-0 sudo[148337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:48 compute-0 python3.9[148340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:48 compute-0 sudo[148337]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:48 compute-0 sudo[148490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttgjcflvxcsrsktlxeifonmvtyjzdeit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348848.24002-679-236099989930368/AnsiballZ_file.py'
Feb 17 17:20:48 compute-0 sudo[148490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:48 compute-0 python3.9[148493]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:48 compute-0 sudo[148490]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:48 compute-0 sudo[148643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vecrkdvsicmnkbwoddddhaknvjzfvujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348848.7319705-679-247212135569144/AnsiballZ_file.py'
Feb 17 17:20:48 compute-0 sudo[148643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:49 compute-0 python3.9[148646]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:49 compute-0 sudo[148643]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:49 compute-0 sudo[148796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhmzagvydaxorgpgopkflpxnzyvrqicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348849.2988644-679-126078282285000/AnsiballZ_file.py'
Feb 17 17:20:49 compute-0 sudo[148796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:49 compute-0 python3.9[148799]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:49 compute-0 sudo[148796]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:49 compute-0 sudo[148949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urtgjeivjqulmleysfoftqcuuvgeghqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348849.7908025-679-237049129554668/AnsiballZ_file.py'
Feb 17 17:20:49 compute-0 sudo[148949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:50 compute-0 python3.9[148952]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:50 compute-0 sudo[148949]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:50 compute-0 sudo[149102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kywqohfiwepbczpwhpnqthrzlnguvryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348850.2881546-679-167564659713949/AnsiballZ_file.py'
Feb 17 17:20:50 compute-0 sudo[149102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:50 compute-0 python3.9[149105]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:50 compute-0 sudo[149102]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:50 compute-0 sudo[149255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpgduhwkqwltdpzvxdxecmnuoafiacde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348850.8042603-679-204761427402863/AnsiballZ_file.py'
Feb 17 17:20:50 compute-0 sudo[149255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:51 compute-0 python3.9[149258]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:51 compute-0 sudo[149255]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:51 compute-0 sudo[149408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaiqlflwpkllyancljipzizdqsffuvlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348851.3281891-778-115566333449036/AnsiballZ_stat.py'
Feb 17 17:20:51 compute-0 sudo[149408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:51 compute-0 python3.9[149411]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:51 compute-0 sudo[149408]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:51 compute-0 sudo[149532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkzzonsbidjktddwdcofdraufbmmpkbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348851.3281891-778-115566333449036/AnsiballZ_copy.py'
Feb 17 17:20:51 compute-0 sudo[149532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:52 compute-0 python3.9[149535]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348851.3281891-778-115566333449036/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:52 compute-0 sudo[149532]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:52 compute-0 sudo[149685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyatohdbdefqpbyasorffevcttvarjuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348852.2936623-778-32401836867050/AnsiballZ_stat.py'
Feb 17 17:20:52 compute-0 sudo[149685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:52 compute-0 python3.9[149688]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:52 compute-0 sudo[149685]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:52 compute-0 sudo[149809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xubhogqvgbinfjstsmtwcfqkfzqxjxfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348852.2936623-778-32401836867050/AnsiballZ_copy.py'
Feb 17 17:20:52 compute-0 sudo[149809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:53 compute-0 podman[149811]: 2026-02-17 17:20:53.010613159 +0000 UTC m=+0.058955073 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:20:53 compute-0 python3.9[149813]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348852.2936623-778-32401836867050/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:53 compute-0 sudo[149809]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:53 compute-0 sudo[149988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsjtxhoswefumivurqhtqpkoziggxchy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348853.2520177-778-236361516223799/AnsiballZ_stat.py'
Feb 17 17:20:53 compute-0 sudo[149988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:53 compute-0 python3.9[149991]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:53 compute-0 sudo[149988]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:53 compute-0 sudo[150112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msjpxksbyszuvumpqpmbilmzdjrfuwxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348853.2520177-778-236361516223799/AnsiballZ_copy.py'
Feb 17 17:20:53 compute-0 sudo[150112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:54 compute-0 python3.9[150115]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348853.2520177-778-236361516223799/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:54 compute-0 sudo[150112]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:54 compute-0 sudo[150265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boxmsustbcsvwnruxptfbysdzkyjkgce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348854.2305846-778-218688421380194/AnsiballZ_stat.py'
Feb 17 17:20:54 compute-0 sudo[150265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:54 compute-0 python3.9[150268]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:54 compute-0 sudo[150265]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:54 compute-0 sudo[150389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajbgwtjzcgagchkzjmbbosbjpxzigenu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348854.2305846-778-218688421380194/AnsiballZ_copy.py'
Feb 17 17:20:54 compute-0 sudo[150389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:55 compute-0 python3.9[150392]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348854.2305846-778-218688421380194/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:55 compute-0 sudo[150389]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:55 compute-0 sudo[150542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnuzmxwtllrrzxllzwwocpzzxaezmjni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348855.3328974-778-74877264915198/AnsiballZ_stat.py'
Feb 17 17:20:55 compute-0 sudo[150542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:55 compute-0 python3.9[150545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:55 compute-0 sudo[150542]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:56 compute-0 sudo[150666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsqqyeqgqlwaabpwtfsqeevvilzpuwfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348855.3328974-778-74877264915198/AnsiballZ_copy.py'
Feb 17 17:20:56 compute-0 sudo[150666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:56 compute-0 python3.9[150669]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348855.3328974-778-74877264915198/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:56 compute-0 sudo[150666]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:56 compute-0 sudo[150819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmpyoffhdihupchghynuwuoahvqtjyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348856.3221216-778-169886695933624/AnsiballZ_stat.py'
Feb 17 17:20:56 compute-0 sudo[150819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:56 compute-0 python3.9[150822]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:56 compute-0 sudo[150819]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:56 compute-0 sudo[150943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izboucnfdkmuyobksblodedewjdswjig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348856.3221216-778-169886695933624/AnsiballZ_copy.py'
Feb 17 17:20:56 compute-0 sudo[150943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:57 compute-0 python3.9[150946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348856.3221216-778-169886695933624/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:57 compute-0 sudo[150943]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:57 compute-0 sudo[151096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znlwtdapqspynwrhqykzofqlsyhgwesi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348857.2617786-778-32171406960307/AnsiballZ_stat.py'
Feb 17 17:20:57 compute-0 sudo[151096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:57 compute-0 python3.9[151099]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:57 compute-0 sudo[151096]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:57 compute-0 sudo[151220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzvalxvoydzsuorkiwttkoyvlzsunocn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348857.2617786-778-32171406960307/AnsiballZ_copy.py'
Feb 17 17:20:57 compute-0 sudo[151220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:58 compute-0 python3.9[151223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348857.2617786-778-32171406960307/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:58 compute-0 sudo[151220]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:58 compute-0 sudo[151373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cclngudnicbcpzmrfpntfbotskjkccoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348858.2751448-778-81006076432463/AnsiballZ_stat.py'
Feb 17 17:20:58 compute-0 sudo[151373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:58 compute-0 python3.9[151376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:58 compute-0 sudo[151373]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:58 compute-0 sudo[151497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnzzpxdoxljbtdamhrjzmwginhqemoyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348858.2751448-778-81006076432463/AnsiballZ_copy.py'
Feb 17 17:20:58 compute-0 sudo[151497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:59 compute-0 python3.9[151500]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348858.2751448-778-81006076432463/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:20:59 compute-0 sudo[151497]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:59 compute-0 sudo[151650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmfmnqmnqbnomijixsdnpuofsxkxzbyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348859.2443724-778-239467808179765/AnsiballZ_stat.py'
Feb 17 17:20:59 compute-0 sudo[151650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:20:59 compute-0 python3.9[151653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:20:59 compute-0 sudo[151650]: pam_unix(sudo:session): session closed for user root
Feb 17 17:20:59 compute-0 sudo[151774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayhkrezoehjggybmstothgqezfdscstx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348859.2443724-778-239467808179765/AnsiballZ_copy.py'
Feb 17 17:20:59 compute-0 sudo[151774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:00 compute-0 python3.9[151777]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348859.2443724-778-239467808179765/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:00 compute-0 sudo[151774]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:00 compute-0 sudo[151927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvrjmovkzaeqjvjdqcsrhcbkpjdzsimr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348860.2478523-778-184750198973464/AnsiballZ_stat.py'
Feb 17 17:21:00 compute-0 sudo[151927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:00 compute-0 python3.9[151930]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:00 compute-0 sudo[151927]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:00 compute-0 sudo[152051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sopftbqxuvpurspwjpxushwfjyrqxuok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348860.2478523-778-184750198973464/AnsiballZ_copy.py'
Feb 17 17:21:00 compute-0 sudo[152051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:01 compute-0 python3.9[152054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348860.2478523-778-184750198973464/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:01 compute-0 sudo[152051]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:01 compute-0 sudo[152204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguwabsgaswozehsujpgdswjvvvxwlrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348861.2694213-778-154177011690176/AnsiballZ_stat.py'
Feb 17 17:21:01 compute-0 sudo[152204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:01 compute-0 python3.9[152207]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:01 compute-0 sudo[152204]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:01 compute-0 sudo[152328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vloygdryowzbbvtjtblpbenrutjdhjlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348861.2694213-778-154177011690176/AnsiballZ_copy.py'
Feb 17 17:21:01 compute-0 sudo[152328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:02 compute-0 python3.9[152332]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348861.2694213-778-154177011690176/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:02 compute-0 sudo[152328]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:02 compute-0 sudo[152482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yealwzbtiqrhxghdflcmkeshgonnacsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348862.3525333-778-125933755464626/AnsiballZ_stat.py'
Feb 17 17:21:02 compute-0 sudo[152482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:02 compute-0 python3.9[152485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:02 compute-0 sudo[152482]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:03 compute-0 sudo[152607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lopsfvbaegkbxtnagzgvduyrnlaghsxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348862.3525333-778-125933755464626/AnsiballZ_copy.py'
Feb 17 17:21:03 compute-0 sudo[152607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:03 compute-0 python3.9[152610]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348862.3525333-778-125933755464626/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:03 compute-0 sudo[152607]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:03 compute-0 sshd-session[152329]: Invalid user admin from 209.38.233.161 port 46058
Feb 17 17:21:03 compute-0 sshd-session[152329]: Connection closed by invalid user admin 209.38.233.161 port 46058 [preauth]
Feb 17 17:21:03 compute-0 sudo[152760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aedrhrvnzexhpkegdhikuudbepqzrljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348863.3865292-778-17592631864544/AnsiballZ_stat.py'
Feb 17 17:21:03 compute-0 sudo[152760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:03 compute-0 python3.9[152763]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:04 compute-0 sudo[152760]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:04 compute-0 sudo[152884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjdblyovuhhrjsilzvxsjlfjzvqybsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348863.3865292-778-17592631864544/AnsiballZ_copy.py'
Feb 17 17:21:04 compute-0 sudo[152884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:04 compute-0 python3.9[152887]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348863.3865292-778-17592631864544/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:04 compute-0 sudo[152884]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:04 compute-0 sudo[153037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtkvadzcjxkpvojcubmczitavwngquks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348864.5828447-778-242642070843902/AnsiballZ_stat.py'
Feb 17 17:21:04 compute-0 sudo[153037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:04 compute-0 python3.9[153040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:04 compute-0 sudo[153037]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:05 compute-0 sudo[153161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexbqutmdpgsiurrjewmxmalxxmirgjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348864.5828447-778-242642070843902/AnsiballZ_copy.py'
Feb 17 17:21:05 compute-0 sudo[153161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:05 compute-0 python3.9[153164]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348864.5828447-778-242642070843902/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:05 compute-0 sudo[153161]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:06 compute-0 python3.9[153314]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:06 compute-0 sudo[153467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouoytujqvknpkutaifmwondnttfpnyqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348866.192246-984-214792255590517/AnsiballZ_seboolean.py'
Feb 17 17:21:06 compute-0 sudo[153467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:06 compute-0 python3.9[153470]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 17 17:21:07 compute-0 sudo[153467]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:08 compute-0 sudo[153624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvtziyrxbvanrcwstzylptwbnujpmgci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348867.8968556-992-122883446608892/AnsiballZ_copy.py'
Feb 17 17:21:08 compute-0 dbus-broker-launch[789]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 17 17:21:08 compute-0 sudo[153624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:08 compute-0 python3.9[153627]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:08 compute-0 sudo[153624]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:08 compute-0 sudo[153777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abaoxmchsuxvlfxacduldjrwajlgsafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348868.4268746-992-46096694363008/AnsiballZ_copy.py'
Feb 17 17:21:08 compute-0 sudo[153777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:08 compute-0 python3.9[153780]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:08 compute-0 sudo[153777]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:09 compute-0 sudo[153930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jezmavnrkgzbgvxuncgxcdcqyvnqxtqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348869.0028572-992-8799488321827/AnsiballZ_copy.py'
Feb 17 17:21:09 compute-0 sudo[153930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:09 compute-0 python3.9[153933]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:09 compute-0 sudo[153930]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:09 compute-0 sudo[154083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxrkheythegpyegimwlnxrqidfsntigf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348869.4972062-992-45534852053066/AnsiballZ_copy.py'
Feb 17 17:21:09 compute-0 sudo[154083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:09 compute-0 python3.9[154086]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:09 compute-0 sudo[154083]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:10 compute-0 sudo[154236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tupzjrmzhjiuqknrhjkykbiyjgpkjjmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348870.0952184-992-191617632288845/AnsiballZ_copy.py'
Feb 17 17:21:10 compute-0 sudo[154236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:10 compute-0 python3.9[154239]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:10 compute-0 sudo[154236]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:10 compute-0 sudo[154389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwbusqikvkmhnessjxdkoyhbfwpatbhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348870.6630352-1028-123469627955377/AnsiballZ_copy.py'
Feb 17 17:21:10 compute-0 sudo[154389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:21:10.933 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:21:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:21:10.934 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:21:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:21:10.934 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:21:11 compute-0 python3.9[154392]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:11 compute-0 sudo[154389]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:11 compute-0 sudo[154555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwuetrvozxppchqzlbhgvxsinvgydszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348871.1633008-1028-195031833039647/AnsiballZ_copy.py'
Feb 17 17:21:11 compute-0 sudo[154555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:11 compute-0 podman[154516]: 2026-02-17 17:21:11.389794421 +0000 UTC m=+0.048139560 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 17 17:21:11 compute-0 python3.9[154563]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:11 compute-0 sudo[154555]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:11 compute-0 sudo[154714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efdaivcsztsvqslzjugczevvfthmkfku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348871.6683633-1028-18663418946909/AnsiballZ_copy.py'
Feb 17 17:21:11 compute-0 sudo[154714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:12 compute-0 python3.9[154717]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:12 compute-0 sudo[154714]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:12 compute-0 sudo[154867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbnmayqvytziwrrklawzcvtuqyqjgwtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348872.2288358-1028-68737741092916/AnsiballZ_copy.py'
Feb 17 17:21:12 compute-0 sudo[154867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:12 compute-0 python3.9[154870]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:12 compute-0 sudo[154867]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:12 compute-0 sudo[155020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqczcbzrjpishdahtmbutkxfzgelygqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348872.8030636-1028-82146144279909/AnsiballZ_copy.py'
Feb 17 17:21:12 compute-0 sudo[155020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:13 compute-0 python3.9[155023]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:13 compute-0 sudo[155020]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:13 compute-0 sudo[155173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uerasdwtlhbuosyugdhfpkvonzionokg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348873.3572261-1064-220969260156676/AnsiballZ_systemd.py'
Feb 17 17:21:13 compute-0 sudo[155173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:13 compute-0 python3.9[155176]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:21:13 compute-0 systemd[1]: Reloading.
Feb 17 17:21:13 compute-0 systemd-rc-local-generator[155199]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:13 compute-0 systemd-sysv-generator[155204]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:14 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 17 17:21:14 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 17 17:21:14 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 17 17:21:14 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 17 17:21:14 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 17 17:21:14 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 17 17:21:14 compute-0 sudo[155173]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:14 compute-0 sudo[155374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhelarsiaaehandlzdckfxauemvythcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348874.4355006-1064-125606537920750/AnsiballZ_systemd.py'
Feb 17 17:21:14 compute-0 sudo[155374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:15 compute-0 python3.9[155377]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:21:15 compute-0 systemd[1]: Reloading.
Feb 17 17:21:15 compute-0 systemd-rc-local-generator[155399]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:15 compute-0 systemd-sysv-generator[155408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:15 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 17 17:21:15 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 17 17:21:15 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 17 17:21:15 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 17 17:21:15 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 17 17:21:15 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 17 17:21:15 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 17 17:21:15 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 17 17:21:15 compute-0 sudo[155374]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:15 compute-0 sudo[155599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpugxiowelueyqviwjxtyjcdilgvidzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348875.4514444-1064-123479103829197/AnsiballZ_systemd.py'
Feb 17 17:21:15 compute-0 sudo[155599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:15 compute-0 python3.9[155602]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:21:16 compute-0 systemd[1]: Reloading.
Feb 17 17:21:16 compute-0 systemd-rc-local-generator[155632]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:16 compute-0 systemd-sysv-generator[155638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:16 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 17 17:21:16 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 17 17:21:16 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 17 17:21:16 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 17 17:21:16 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 17 17:21:16 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 17 17:21:16 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 17 17:21:16 compute-0 sudo[155599]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:16 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 17 17:21:16 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 17 17:21:16 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 17 17:21:16 compute-0 sudo[155825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iewquitoxbxcoufyyvdmryxfevmsybqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348876.5271676-1064-246791225581066/AnsiballZ_systemd.py'
Feb 17 17:21:16 compute-0 sudo[155825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:17 compute-0 python3.9[155828]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:21:17 compute-0 systemd[1]: Reloading.
Feb 17 17:21:17 compute-0 systemd-rc-local-generator[155859]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:17 compute-0 systemd-sysv-generator[155863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:17 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 17 17:21:17 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 17 17:21:17 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 17 17:21:17 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 17 17:21:17 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 17 17:21:17 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 17 17:21:17 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 17 17:21:17 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 17 17:21:17 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 17 17:21:17 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 17 17:21:17 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 17 17:21:17 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 17 17:21:17 compute-0 sudo[155825]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:17 compute-0 sudo[156051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prafbixnbfoguddkwhlilyomcfwdcmbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348877.4909742-1064-34025179553387/AnsiballZ_systemd.py'
Feb 17 17:21:17 compute-0 sudo[156051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:17 compute-0 setroubleshoot[155645]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 9d3a5639-97f0-4162-b878-ff494edbb5cb
Feb 17 17:21:17 compute-0 setroubleshoot[155645]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 17 17:21:17 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:21:18 compute-0 python3.9[156054]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:21:18 compute-0 systemd[1]: Reloading.
Feb 17 17:21:18 compute-0 systemd-sysv-generator[156083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:18 compute-0 systemd-rc-local-generator[156080]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:18 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 17 17:21:18 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 17 17:21:18 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 17 17:21:18 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 17 17:21:18 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 17 17:21:18 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 17 17:21:18 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 17 17:21:18 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 17 17:21:18 compute-0 sudo[156051]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:18 compute-0 sudo[156271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fraduibnrlzuqntdhobsyzfwrziazwph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348878.5282586-1101-27755319463532/AnsiballZ_file.py'
Feb 17 17:21:18 compute-0 sudo[156271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:18 compute-0 python3.9[156274]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:18 compute-0 sudo[156271]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:19 compute-0 sudo[156424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmdpmokbangnewxrsvfajseecpeiosrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348879.0734737-1109-25668184285497/AnsiballZ_find.py'
Feb 17 17:21:19 compute-0 sudo[156424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:19 compute-0 python3.9[156427]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:21:19 compute-0 sudo[156424]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:20 compute-0 sudo[156577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djuiebzonjvrtvdlgbzvxdadwjlddekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348879.8525286-1123-68909675645310/AnsiballZ_stat.py'
Feb 17 17:21:20 compute-0 sudo[156577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:20 compute-0 python3.9[156580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:20 compute-0 sudo[156577]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:20 compute-0 sudo[156701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sthubrdxerpogxzkjwczjfloppcuummn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348879.8525286-1123-68909675645310/AnsiballZ_copy.py'
Feb 17 17:21:20 compute-0 sudo[156701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:20 compute-0 python3.9[156704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348879.8525286-1123-68909675645310/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:20 compute-0 sudo[156701]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:21 compute-0 sudo[156854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdlcmxcjkvhwnhmhwocyujtviztgxyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348881.0221374-1139-269459334079069/AnsiballZ_file.py'
Feb 17 17:21:21 compute-0 sudo[156854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:21 compute-0 python3.9[156857]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:21 compute-0 sudo[156854]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:21 compute-0 sudo[157007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owccrilbfneywektoeojcznzkjmmsxte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348881.5657432-1147-30048120472941/AnsiballZ_stat.py'
Feb 17 17:21:21 compute-0 sudo[157007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:22 compute-0 python3.9[157010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:22 compute-0 sudo[157007]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:22 compute-0 sudo[157086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyxnjgcgpwbvuotpvtimkjhatkxerawl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348881.5657432-1147-30048120472941/AnsiballZ_file.py'
Feb 17 17:21:22 compute-0 sudo[157086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:22 compute-0 python3.9[157089]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:22 compute-0 sudo[157086]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:22 compute-0 sudo[157239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbmdrhrammqwqmnjrwyggfqncsmpjwrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348882.6027987-1159-279384980690351/AnsiballZ_stat.py'
Feb 17 17:21:22 compute-0 sudo[157239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:23 compute-0 python3.9[157242]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:23 compute-0 sudo[157239]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:23 compute-0 podman[157243]: 2026-02-17 17:21:23.177808066 +0000 UTC m=+0.090577025 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 17 17:21:23 compute-0 sudo[157344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pisbvdbptmekkqmdyyxwuqefzvgbbcrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348882.6027987-1159-279384980690351/AnsiballZ_file.py'
Feb 17 17:21:23 compute-0 sudo[157344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:23 compute-0 python3.9[157347]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.vt8z36cd recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:23 compute-0 sudo[157344]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:23 compute-0 sudo[157497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ancsaxnajmnslfjrwyfedmzgcjxzpklh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348883.609889-1171-9984002544456/AnsiballZ_stat.py'
Feb 17 17:21:23 compute-0 sudo[157497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:24 compute-0 python3.9[157500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:24 compute-0 sudo[157497]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:24 compute-0 sudo[157576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnerpjsozbalkzebhyfxnhfcdoaczdex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348883.609889-1171-9984002544456/AnsiballZ_file.py'
Feb 17 17:21:24 compute-0 sudo[157576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:24 compute-0 python3.9[157579]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:24 compute-0 sudo[157576]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:24 compute-0 sudo[157729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvxvfojqpxuvapfqmsdidqzpzkbqzvcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348884.586995-1184-132146493956649/AnsiballZ_command.py'
Feb 17 17:21:24 compute-0 sudo[157729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:25 compute-0 python3.9[157732]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:25 compute-0 sudo[157729]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:25 compute-0 sudo[157883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kczygqnweauqxtyjomomqbkkhofocigg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771348885.3839886-1192-188649133489563/AnsiballZ_edpm_nftables_from_files.py'
Feb 17 17:21:25 compute-0 sudo[157883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:25 compute-0 python3[157886]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 17 17:21:26 compute-0 sudo[157883]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:26 compute-0 sudo[158036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hermjjtenjqjglmzcqmynzxaspaezvbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348886.149839-1200-253975848217512/AnsiballZ_stat.py'
Feb 17 17:21:26 compute-0 sudo[158036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:26 compute-0 python3.9[158039]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:26 compute-0 sudo[158036]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:26 compute-0 sudo[158115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saelzhqukbsybipnskkkbzkopcykilgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348886.149839-1200-253975848217512/AnsiballZ_file.py'
Feb 17 17:21:26 compute-0 sudo[158115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:26 compute-0 python3.9[158118]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:27 compute-0 sudo[158115]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:27 compute-0 sudo[158268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btqurjtbvyjraygxwersnilgksfgsion ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348887.1320388-1212-13602908843942/AnsiballZ_stat.py'
Feb 17 17:21:27 compute-0 sudo[158268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:27 compute-0 python3.9[158271]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:27 compute-0 sudo[158268]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:27 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 17 17:21:27 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.044s CPU time.
Feb 17 17:21:27 compute-0 sudo[158394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyoeooiqamvrjimvjykrbxuaykjglfhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348887.1320388-1212-13602908843942/AnsiballZ_copy.py'
Feb 17 17:21:27 compute-0 sudo[158394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:27 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 17 17:21:28 compute-0 python3.9[158397]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348887.1320388-1212-13602908843942/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:28 compute-0 sudo[158394]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:28 compute-0 sudo[158547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udcpvhvezsjxcnfplnydgakmogezyhnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348888.1863923-1227-219835846677788/AnsiballZ_stat.py'
Feb 17 17:21:28 compute-0 sudo[158547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:28 compute-0 python3.9[158550]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:28 compute-0 sudo[158547]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:28 compute-0 sudo[158626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkqjsmrivmwnxgayxvztzlvscnjdgkdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348888.1863923-1227-219835846677788/AnsiballZ_file.py'
Feb 17 17:21:28 compute-0 sudo[158626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:28 compute-0 python3.9[158629]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:29 compute-0 sudo[158626]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:29 compute-0 sudo[158779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcfcepnlkdrykqxxliubawtwwadgmoeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348889.144583-1239-148811366068865/AnsiballZ_stat.py'
Feb 17 17:21:29 compute-0 sudo[158779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:29 compute-0 python3.9[158782]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:29 compute-0 sudo[158779]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:29 compute-0 sudo[158858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muysnrwhfkbasduhyrjdvwdznlspulic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348889.144583-1239-148811366068865/AnsiballZ_file.py'
Feb 17 17:21:29 compute-0 sudo[158858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:29 compute-0 python3.9[158861]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:30 compute-0 sudo[158858]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:30 compute-0 sudo[159011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arqbaxuxzyazqaydholeowyykprsrhkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348890.1500862-1251-160153507160773/AnsiballZ_stat.py'
Feb 17 17:21:30 compute-0 sudo[159011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:30 compute-0 python3.9[159014]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:30 compute-0 sudo[159011]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:30 compute-0 sudo[159137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efbuqlmowlzixfxdjeqypgxkkzkztzjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348890.1500862-1251-160153507160773/AnsiballZ_copy.py'
Feb 17 17:21:30 compute-0 sudo[159137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:31 compute-0 python3.9[159140]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771348890.1500862-1251-160153507160773/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:31 compute-0 sudo[159137]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:31 compute-0 sudo[159290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyplwimbiinastvsrakqkrszhjmfcjlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348891.319695-1266-217301893487971/AnsiballZ_file.py'
Feb 17 17:21:31 compute-0 sudo[159290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:31 compute-0 python3.9[159293]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:31 compute-0 sudo[159290]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:32 compute-0 sudo[159443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siqayaqafvktqumbzqkiyauznyakjkgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348892.084426-1274-81626549395266/AnsiballZ_command.py'
Feb 17 17:21:32 compute-0 sudo[159443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:32 compute-0 python3.9[159446]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:32 compute-0 sudo[159443]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:33 compute-0 sudo[159599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhwvycmqclgfyxoynzupsqhgapqiwgjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348892.658957-1282-213278522666564/AnsiballZ_blockinfile.py'
Feb 17 17:21:33 compute-0 sudo[159599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:33 compute-0 python3.9[159602]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:33 compute-0 sudo[159599]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:33 compute-0 sudo[159752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjypxwrhvgmrqxmsfrjbgyyogdfoorvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348893.4885845-1291-143570721521863/AnsiballZ_command.py'
Feb 17 17:21:33 compute-0 sudo[159752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:33 compute-0 python3.9[159755]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:33 compute-0 sudo[159752]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:34 compute-0 sudo[159906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chzaindtuqrtbmuddoajhfgkrcsqamvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348894.1082828-1299-69015441837042/AnsiballZ_stat.py'
Feb 17 17:21:34 compute-0 sudo[159906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:34 compute-0 python3.9[159909]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:21:34 compute-0 sudo[159906]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:34 compute-0 sudo[160061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtbofkrddumpzqlqusacsbocabikjqwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348894.6939142-1307-280898518533384/AnsiballZ_command.py'
Feb 17 17:21:34 compute-0 sudo[160061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:35 compute-0 python3.9[160064]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:35 compute-0 sudo[160061]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:35 compute-0 sudo[160217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldtiiqicirolujgfhmlgoigeqwcughkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348895.2670999-1315-249655731068802/AnsiballZ_file.py'
Feb 17 17:21:35 compute-0 sudo[160217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:35 compute-0 python3.9[160220]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:35 compute-0 sudo[160217]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:36 compute-0 sudo[160370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufrgcznyvhcrqapgkunlnfhvcpvwzamc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348895.823114-1323-235038000128872/AnsiballZ_stat.py'
Feb 17 17:21:36 compute-0 sudo[160370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:36 compute-0 python3.9[160373]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:36 compute-0 sudo[160370]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:36 compute-0 sudo[160494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spioptujboggffxwxsukszemslaispeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348895.823114-1323-235038000128872/AnsiballZ_copy.py'
Feb 17 17:21:36 compute-0 sudo[160494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:36 compute-0 python3.9[160497]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348895.823114-1323-235038000128872/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:36 compute-0 sudo[160494]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:37 compute-0 sudo[160647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljvhddaiypzcnfzlikcglwrisqbokbvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348896.8613088-1338-35787497196013/AnsiballZ_stat.py'
Feb 17 17:21:37 compute-0 sudo[160647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:37 compute-0 python3.9[160650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:37 compute-0 sudo[160647]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:37 compute-0 sudo[160771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmyvotcvijkovoxhzlxhcmdmzqsjnttm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348896.8613088-1338-35787497196013/AnsiballZ_copy.py'
Feb 17 17:21:37 compute-0 sudo[160771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:37 compute-0 python3.9[160774]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348896.8613088-1338-35787497196013/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:37 compute-0 sudo[160771]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:38 compute-0 sudo[160924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etsiefpwxboehcfljnbkmlxmdphocfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348897.8755221-1353-49149479830771/AnsiballZ_stat.py'
Feb 17 17:21:38 compute-0 sudo[160924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:38 compute-0 python3.9[160927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:21:38 compute-0 sudo[160924]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:38 compute-0 sudo[161048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbdsrahmzxkhbrxfoiykkcgemseqlvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348897.8755221-1353-49149479830771/AnsiballZ_copy.py'
Feb 17 17:21:38 compute-0 sudo[161048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:38 compute-0 python3.9[161051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348897.8755221-1353-49149479830771/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:21:38 compute-0 sudo[161048]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:39 compute-0 sudo[161201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhrillwyklzfziqxckmcnupmqoetlwru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348898.9492898-1368-228023830110132/AnsiballZ_systemd.py'
Feb 17 17:21:39 compute-0 sudo[161201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:39 compute-0 python3.9[161204]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:21:39 compute-0 systemd[1]: Reloading.
Feb 17 17:21:39 compute-0 systemd-sysv-generator[161235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:39 compute-0 systemd-rc-local-generator[161227]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:39 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 17 17:21:39 compute-0 sudo[161201]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:40 compute-0 sudo[161400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yynyysvcdbfybyrfxsyiufrouylpmcdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348899.8879666-1376-158678215508333/AnsiballZ_systemd.py'
Feb 17 17:21:40 compute-0 sudo[161400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:40 compute-0 python3.9[161403]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 17 17:21:40 compute-0 systemd[1]: Reloading.
Feb 17 17:21:40 compute-0 systemd-rc-local-generator[161428]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:40 compute-0 systemd-sysv-generator[161431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:40 compute-0 systemd[1]: Reloading.
Feb 17 17:21:40 compute-0 systemd-rc-local-generator[161471]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:21:40 compute-0 systemd-sysv-generator[161474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:21:40 compute-0 sudo[161400]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:41 compute-0 sshd-session[106457]: Connection closed by 192.168.122.30 port 36840
Feb 17 17:21:41 compute-0 sshd-session[106454]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:21:41 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 17 17:21:41 compute-0 systemd[1]: session-22.scope: Consumed 2min 45.402s CPU time.
Feb 17 17:21:41 compute-0 systemd-logind[806]: Session 22 logged out. Waiting for processes to exit.
Feb 17 17:21:41 compute-0 systemd-logind[806]: Removed session 22.
Feb 17 17:21:41 compute-0 podman[161514]: 2026-02-17 17:21:41.753598501 +0000 UTC m=+0.093521305 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 17 17:21:46 compute-0 sshd-session[161534]: Accepted publickey for zuul from 192.168.122.30 port 54016 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:21:46 compute-0 systemd-logind[806]: New session 23 of user zuul.
Feb 17 17:21:46 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 17 17:21:46 compute-0 sshd-session[161534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:21:47 compute-0 python3.9[161687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:21:48 compute-0 python3.9[161841]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:21:48 compute-0 network[161858]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:21:48 compute-0 network[161859]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:21:48 compute-0 network[161860]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:21:51 compute-0 sudo[162130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkpwafwcnduqzlnjilkxeogrbfndarqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348910.9482765-42-119588937121864/AnsiballZ_setup.py'
Feb 17 17:21:51 compute-0 sudo[162130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:51 compute-0 python3.9[162133]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 17 17:21:51 compute-0 sudo[162130]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:52 compute-0 sudo[162215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpjvprlvxjuaordjcpjqchwhaqplubtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348910.9482765-42-119588937121864/AnsiballZ_dnf.py'
Feb 17 17:21:52 compute-0 sudo[162215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:52 compute-0 python3.9[162218]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:21:53 compute-0 podman[162220]: 2026-02-17 17:21:53.750715827 +0000 UTC m=+0.090763339 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.license=GPLv2, tcib_managed=true)
Feb 17 17:21:57 compute-0 sudo[162215]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:57 compute-0 sudo[162398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzvztedaajtozkwwittnnorzomodrnzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348917.5945141-54-2843123066305/AnsiballZ_stat.py'
Feb 17 17:21:57 compute-0 sudo[162398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:58 compute-0 python3.9[162401]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:21:58 compute-0 sudo[162398]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:58 compute-0 sudo[162551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bunpxeznfzkquxmhgshoglkyjkadhqhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348918.3499064-64-113854343567540/AnsiballZ_command.py'
Feb 17 17:21:58 compute-0 sudo[162551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:58 compute-0 python3.9[162554]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:21:58 compute-0 sudo[162551]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:59 compute-0 sudo[162705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwhuddexejhfvwnnalruemarctqadoyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348919.177545-74-105991222696780/AnsiballZ_stat.py'
Feb 17 17:21:59 compute-0 sudo[162705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:21:59 compute-0 python3.9[162708]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:21:59 compute-0 sudo[162705]: pam_unix(sudo:session): session closed for user root
Feb 17 17:21:59 compute-0 sudo[162858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqefdvuaoxkuazdtauqtocwpddldkphq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348919.7349498-82-135388174283786/AnsiballZ_command.py'
Feb 17 17:21:59 compute-0 sudo[162858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:00 compute-0 python3.9[162861]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:22:00 compute-0 sudo[162858]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:00 compute-0 sudo[163012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfqpkmzizteuitdaaghllpkoxixleqwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348920.271479-90-270369136088172/AnsiballZ_stat.py'
Feb 17 17:22:00 compute-0 sudo[163012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:00 compute-0 python3.9[163015]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:22:00 compute-0 sudo[163012]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:01 compute-0 sudo[163136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ledzuxfplvcxlgtsavixttplmymwsdhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348920.271479-90-270369136088172/AnsiballZ_copy.py'
Feb 17 17:22:01 compute-0 sudo[163136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:01 compute-0 python3.9[163139]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348920.271479-90-270369136088172/.source.iscsi _original_basename=.vj12rvcz follow=False checksum=931476ba9019a9970a79142c406fb9edc952c3bc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:01 compute-0 sudo[163136]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:01 compute-0 sudo[163289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igyqdgvedysmpshgfgbjgvhiytftzedp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348921.4421182-105-91108186587576/AnsiballZ_file.py'
Feb 17 17:22:01 compute-0 sudo[163289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:01 compute-0 python3.9[163292]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:01 compute-0 sudo[163289]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:02 compute-0 sudo[163442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyxbpdeoqgkasmgkosqdtfklnbktdgal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348922.1285021-113-143555730485169/AnsiballZ_lineinfile.py'
Feb 17 17:22:02 compute-0 sudo[163442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:02 compute-0 python3.9[163445]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:02 compute-0 sudo[163442]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:03 compute-0 sudo[163595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afxkndngeqjbtxirzcgpoyhtbgyihgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348922.995007-122-85833006675020/AnsiballZ_systemd_service.py'
Feb 17 17:22:03 compute-0 sudo[163595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:04 compute-0 python3.9[163598]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:04 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 17 17:22:04 compute-0 sudo[163595]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:04 compute-0 sudo[163752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdsbvrijaopbvqrwspxitresoiidewer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348924.2541645-130-254821141989692/AnsiballZ_systemd_service.py'
Feb 17 17:22:04 compute-0 sudo[163752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:04 compute-0 python3.9[163755]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:04 compute-0 systemd[1]: Reloading.
Feb 17 17:22:04 compute-0 systemd-sysv-generator[163784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:04 compute-0 systemd-rc-local-generator[163779]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:05 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 17 17:22:05 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 17 17:22:05 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 17 17:22:05 compute-0 systemd[1]: Started Open-iSCSI.
Feb 17 17:22:05 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 17 17:22:05 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 17 17:22:05 compute-0 sudo[163752]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:05 compute-0 python3.9[163961]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:22:05 compute-0 network[163978]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:22:05 compute-0 network[163979]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:22:05 compute-0 network[163980]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:22:08 compute-0 sudo[164250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dilzqrnbynzfknienmbdufiwaxjjcfiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348928.4287562-153-202022794212798/AnsiballZ_dnf.py'
Feb 17 17:22:08 compute-0 sudo[164250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:08 compute-0 python3.9[164253]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:22:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:22:10.934 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:22:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:22:10.935 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:22:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:22:10.935 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:22:11 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:22:11 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:22:11 compute-0 systemd[1]: Reloading.
Feb 17 17:22:11 compute-0 systemd-sysv-generator[164301]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:11 compute-0 systemd-rc-local-generator[164297]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:11 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:22:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:22:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:22:11 compute-0 systemd[1]: run-rc2d907142ab148018be64c82249d0f06.service: Deactivated successfully.
Feb 17 17:22:11 compute-0 sudo[164250]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:11 compute-0 podman[164434]: 2026-02-17 17:22:11.874780592 +0000 UTC m=+0.064794221 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 17 17:22:12 compute-0 sudo[164603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjdxgmakkoogfjroezmndzcodzsdzyhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348931.9504962-162-106275400235501/AnsiballZ_file.py'
Feb 17 17:22:12 compute-0 sudo[164603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:12 compute-0 python3.9[164606]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 17 17:22:12 compute-0 sudo[164603]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:12 compute-0 sudo[164756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbafbodbuepqppcndxmbdnrkxoseuyxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348932.5199497-170-221777662859144/AnsiballZ_modprobe.py'
Feb 17 17:22:12 compute-0 sudo[164756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:13 compute-0 python3.9[164759]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 17 17:22:13 compute-0 sudo[164756]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:13 compute-0 sudo[164913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guxzrubwaaflxtshqulhlblsonfskgqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348933.248204-178-184425945627335/AnsiballZ_stat.py'
Feb 17 17:22:13 compute-0 sudo[164913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:13 compute-0 python3.9[164916]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:22:13 compute-0 sudo[164913]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:13 compute-0 sudo[165037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exxrggeayhmtcrnmqirpqqggafqxqcin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348933.248204-178-184425945627335/AnsiballZ_copy.py'
Feb 17 17:22:13 compute-0 sudo[165037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:14 compute-0 python3.9[165040]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348933.248204-178-184425945627335/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:14 compute-0 sudo[165037]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:14 compute-0 sudo[165190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuqjwikzieshvmmmnhqloxovgppldxzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348934.3177776-194-161457533000972/AnsiballZ_lineinfile.py'
Feb 17 17:22:14 compute-0 sudo[165190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:15 compute-0 python3.9[165193]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:15 compute-0 sudo[165190]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:15 compute-0 sudo[165343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxcxxwfcyqvvjuncxrlqdureddipgogm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348935.1977248-202-16187270864766/AnsiballZ_systemd.py'
Feb 17 17:22:15 compute-0 sudo[165343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:16 compute-0 python3.9[165346]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:22:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 17 17:22:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 17 17:22:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 17 17:22:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 17 17:22:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 17 17:22:16 compute-0 sudo[165343]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:16 compute-0 sudo[165500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwnjuavmturqxxzxarmimmybxaaatfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348936.245313-210-248109475118648/AnsiballZ_command.py'
Feb 17 17:22:16 compute-0 sudo[165500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:16 compute-0 python3.9[165503]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:22:16 compute-0 sudo[165500]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:17 compute-0 sudo[165654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oydgwreeyyfqkyrfgjgmvimkeonoituq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348936.9193635-220-72426134805656/AnsiballZ_stat.py'
Feb 17 17:22:17 compute-0 sudo[165654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:17 compute-0 python3.9[165657]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:22:17 compute-0 sudo[165654]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:17 compute-0 sshd-session[165658]: Invalid user admin from 209.38.233.161 port 55850
Feb 17 17:22:17 compute-0 sshd-session[165658]: Connection closed by invalid user admin 209.38.233.161 port 55850 [preauth]
Feb 17 17:22:17 compute-0 sudo[165809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxerhecqgmlcpwxciapenjletyznngxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348937.5158012-229-29665503311093/AnsiballZ_stat.py'
Feb 17 17:22:17 compute-0 sudo[165809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:17 compute-0 python3.9[165812]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:22:17 compute-0 sudo[165809]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:18 compute-0 sudo[165933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlkgonjmgqfsqsuvfpbiuvohzvajoqiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348937.5158012-229-29665503311093/AnsiballZ_copy.py'
Feb 17 17:22:18 compute-0 sudo[165933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:18 compute-0 python3.9[165936]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348937.5158012-229-29665503311093/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:18 compute-0 sudo[165933]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:18 compute-0 sudo[166086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isbeyhczbepmafuwmdgusezegsqpwqim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348938.5683427-244-1966071433039/AnsiballZ_command.py'
Feb 17 17:22:18 compute-0 sudo[166086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:18 compute-0 python3.9[166089]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:22:18 compute-0 sudo[166086]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:19 compute-0 sudo[166240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heqdslditqikwbnweqeoeqezstnuliiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348939.066956-252-76349342547560/AnsiballZ_lineinfile.py'
Feb 17 17:22:19 compute-0 sudo[166240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:19 compute-0 python3.9[166243]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:19 compute-0 sudo[166240]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:19 compute-0 sudo[166393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwupakiypbsutmklfeovcjdtbsmgkcbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348939.5793917-260-53994755040924/AnsiballZ_replace.py'
Feb 17 17:22:20 compute-0 sudo[166393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:20 compute-0 python3.9[166396]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:20 compute-0 sudo[166393]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:20 compute-0 sudo[166546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unaqstojwapxeiltzcdahuytypjzfjsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348940.3237042-268-206197946870628/AnsiballZ_replace.py'
Feb 17 17:22:20 compute-0 sudo[166546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:20 compute-0 python3.9[166549]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:20 compute-0 sudo[166546]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:21 compute-0 sudo[166699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnwnsppmjsmdwsossezmgfjiluknxujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348940.9288545-277-35830690013657/AnsiballZ_lineinfile.py'
Feb 17 17:22:21 compute-0 sudo[166699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:21 compute-0 python3.9[166702]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:21 compute-0 sudo[166699]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:21 compute-0 sudo[166852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jheayzzandfwksqhwrkijhujnnttldaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348941.4932663-277-153475524127680/AnsiballZ_lineinfile.py'
Feb 17 17:22:21 compute-0 sudo[166852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:21 compute-0 python3.9[166855]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:21 compute-0 sudo[166852]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:22 compute-0 sudo[167005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkhfsitamjfdvjfthokpelcwqokofkzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348942.0306494-277-262006392128835/AnsiballZ_lineinfile.py'
Feb 17 17:22:22 compute-0 sudo[167005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:22 compute-0 python3.9[167008]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:22 compute-0 sudo[167005]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:22 compute-0 sudo[167158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmjoywplrugqmiieemoflveezcvsryls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348942.5867453-277-151062371220747/AnsiballZ_lineinfile.py'
Feb 17 17:22:22 compute-0 sudo[167158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:22 compute-0 python3.9[167161]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:22 compute-0 sudo[167158]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:23 compute-0 sudo[167311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjtiefxjiymjmuprvtwpyfzjawfhtjhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348943.1343687-306-20172164630208/AnsiballZ_stat.py'
Feb 17 17:22:23 compute-0 sudo[167311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:23 compute-0 python3.9[167314]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:22:23 compute-0 sudo[167311]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:23 compute-0 sudo[167479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiafuqcdubuxlitmqhlybppqnamedaus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348943.746683-314-94346534342647/AnsiballZ_command.py'
Feb 17 17:22:23 compute-0 sudo[167479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:24 compute-0 podman[167440]: 2026-02-17 17:22:24.048041656 +0000 UTC m=+0.099115221 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 17 17:22:24 compute-0 python3.9[167490]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:22:24 compute-0 sudo[167479]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:24 compute-0 sudo[167647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amleeidyekjqmwywcelyeyrpbthtqcrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348944.4171567-323-213522744397686/AnsiballZ_systemd_service.py'
Feb 17 17:22:24 compute-0 sudo[167647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:24 compute-0 python3.9[167650]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:24 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 17 17:22:24 compute-0 sudo[167647]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:25 compute-0 sudo[167804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpipkvlzxmfwkirbebsdblyijzhfbhra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348945.117044-331-33059609589894/AnsiballZ_systemd_service.py'
Feb 17 17:22:25 compute-0 sudo[167804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:25 compute-0 python3.9[167807]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:25 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 17 17:22:25 compute-0 udevadm[167812]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 17 17:22:25 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 17 17:22:25 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 17 17:22:25 compute-0 multipathd[167815]: --------start up--------
Feb 17 17:22:25 compute-0 multipathd[167815]: read /etc/multipath.conf
Feb 17 17:22:25 compute-0 multipathd[167815]: path checkers start up
Feb 17 17:22:25 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 17 17:22:25 compute-0 sudo[167804]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:26 compute-0 sudo[167973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkkozotvjwysxebfdfhyyrifqsvoyyek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348946.1662624-343-151269517606298/AnsiballZ_file.py'
Feb 17 17:22:26 compute-0 sudo[167973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:26 compute-0 python3.9[167976]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 17 17:22:26 compute-0 sudo[167973]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:27 compute-0 sudo[168126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgvmacslgpntydjtirhdgjhibfjuucic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348946.8451278-351-10703367462194/AnsiballZ_modprobe.py'
Feb 17 17:22:27 compute-0 sudo[168126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:27 compute-0 python3.9[168129]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 17 17:22:27 compute-0 kernel: Key type psk registered
Feb 17 17:22:27 compute-0 sudo[168126]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:27 compute-0 sudo[168288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-menxyqpxpmeadmpznrhkzbazowjtipsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348947.489935-359-3089773106047/AnsiballZ_stat.py'
Feb 17 17:22:27 compute-0 sudo[168288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:27 compute-0 python3.9[168291]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:22:27 compute-0 sudo[168288]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:28 compute-0 sudo[168412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eryzpkfzltatdbekijgppnelybcaqjex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348947.489935-359-3089773106047/AnsiballZ_copy.py'
Feb 17 17:22:28 compute-0 sudo[168412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:28 compute-0 python3.9[168415]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771348947.489935-359-3089773106047/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:28 compute-0 sudo[168412]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:28 compute-0 sudo[168565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhycpdreuipacrmjechrwjnwvjoipbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348948.6553342-375-155517912181306/AnsiballZ_lineinfile.py'
Feb 17 17:22:28 compute-0 sudo[168565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:29 compute-0 python3.9[168568]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:29 compute-0 sudo[168565]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:29 compute-0 sudo[168718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wljhcfsncaojwekvsleztivjtoymumya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348949.1957824-383-87478143486111/AnsiballZ_systemd.py'
Feb 17 17:22:29 compute-0 sudo[168718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:29 compute-0 python3.9[168721]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:22:29 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 17 17:22:29 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 17 17:22:29 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 17 17:22:29 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 17 17:22:29 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 17 17:22:29 compute-0 sudo[168718]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:30 compute-0 sudo[168875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oltryshhfyfzftqoqsbfjuzmgjnskovl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348949.942915-391-110792245245168/AnsiballZ_dnf.py'
Feb 17 17:22:30 compute-0 sudo[168875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:30 compute-0 python3.9[168878]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 17 17:22:32 compute-0 systemd[1]: Reloading.
Feb 17 17:22:33 compute-0 systemd-rc-local-generator[168911]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:33 compute-0 systemd-sysv-generator[168917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:33 compute-0 systemd[1]: Reloading.
Feb 17 17:22:33 compute-0 systemd-rc-local-generator[168944]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:33 compute-0 systemd-sysv-generator[168951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:33 compute-0 systemd-logind[806]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 17 17:22:33 compute-0 systemd-logind[806]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 17 17:22:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 17 17:22:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 17 17:22:33 compute-0 systemd[1]: Reloading.
Feb 17 17:22:33 compute-0 systemd-sysv-generator[169048]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:33 compute-0 systemd-rc-local-generator[169045]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 17 17:22:34 compute-0 sudo[168875]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 17 17:22:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 17 17:22:34 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.141s CPU time.
Feb 17 17:22:34 compute-0 systemd[1]: run-rc7e40e56eab2497da28fab14f6626968.service: Deactivated successfully.
Feb 17 17:22:34 compute-0 sudo[170363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaamveoqopdmriwwuisvbnnrxwfecdld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348954.5682757-399-95917675115055/AnsiballZ_systemd_service.py'
Feb 17 17:22:34 compute-0 sudo[170363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:35 compute-0 python3.9[170366]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:22:35 compute-0 iscsid[163802]: iscsid shutting down.
Feb 17 17:22:35 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 17 17:22:35 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 17 17:22:35 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 17 17:22:35 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 17 17:22:35 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 17 17:22:35 compute-0 systemd[1]: Started Open-iSCSI.
Feb 17 17:22:35 compute-0 sudo[170363]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:35 compute-0 sudo[170520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgxnymxlmbpjrnhshaltyketadyzkspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348955.27809-407-237805022417550/AnsiballZ_systemd_service.py'
Feb 17 17:22:35 compute-0 sudo[170520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:35 compute-0 python3.9[170523]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:22:35 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 17 17:22:35 compute-0 multipathd[167815]: exit (signal)
Feb 17 17:22:35 compute-0 multipathd[167815]: --------shut down-------
Feb 17 17:22:35 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 17 17:22:35 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 17 17:22:35 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 17 17:22:35 compute-0 multipathd[170529]: --------start up--------
Feb 17 17:22:35 compute-0 multipathd[170529]: read /etc/multipath.conf
Feb 17 17:22:35 compute-0 multipathd[170529]: path checkers start up
Feb 17 17:22:35 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 17 17:22:35 compute-0 sudo[170520]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:36 compute-0 python3.9[170687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:22:37 compute-0 sudo[170841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oybpypoeovqivzxsxxezbxgqmwqarlft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348957.0126534-425-153924502512828/AnsiballZ_file.py'
Feb 17 17:22:37 compute-0 sudo[170841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:37 compute-0 python3.9[170844]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:37 compute-0 sudo[170841]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:38 compute-0 sudo[170994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-negsilclmkgsbckpapwdznqkwamvvphd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348957.7856023-436-205954511157581/AnsiballZ_systemd_service.py'
Feb 17 17:22:38 compute-0 sudo[170994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:38 compute-0 python3.9[170997]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:22:38 compute-0 systemd[1]: Reloading.
Feb 17 17:22:38 compute-0 systemd-rc-local-generator[171023]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:22:38 compute-0 systemd-sysv-generator[171028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:22:38 compute-0 sudo[170994]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:39 compute-0 python3.9[171189]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:22:39 compute-0 network[171206]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:22:39 compute-0 network[171207]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:22:39 compute-0 network[171208]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:22:41 compute-0 podman[171305]: 2026-02-17 17:22:41.991461755 +0000 UTC m=+0.061654723 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 17 17:22:43 compute-0 sudo[171498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqhlxsdmivaalsigfymemoejupivruvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348963.5575736-455-182158154294527/AnsiballZ_systemd_service.py'
Feb 17 17:22:43 compute-0 sudo[171498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:44 compute-0 python3.9[171501]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:44 compute-0 sudo[171498]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:44 compute-0 sudo[171652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feyltaysjgnnkzzybwsrnxrbebferdrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348964.2064044-455-186673647350169/AnsiballZ_systemd_service.py'
Feb 17 17:22:44 compute-0 sudo[171652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:44 compute-0 python3.9[171655]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:44 compute-0 sudo[171652]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:45 compute-0 sudo[171806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmxoryxbjogsebcrlmvppmbvtsbhvulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348964.9046745-455-208018214485559/AnsiballZ_systemd_service.py'
Feb 17 17:22:45 compute-0 sudo[171806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:45 compute-0 python3.9[171809]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:45 compute-0 sudo[171806]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:45 compute-0 sudo[171960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftcvmbjenvhfvshqjxhxskrzfcgunkib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348965.564726-455-96090614769973/AnsiballZ_systemd_service.py'
Feb 17 17:22:45 compute-0 sudo[171960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:46 compute-0 python3.9[171963]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:46 compute-0 sudo[171960]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:46 compute-0 sudo[172114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isyufqpteuvyhukwbcuiuunigdlikvhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348966.2050965-455-211741229566459/AnsiballZ_systemd_service.py'
Feb 17 17:22:46 compute-0 sudo[172114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:46 compute-0 python3.9[172117]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:46 compute-0 sudo[172114]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:47 compute-0 sudo[172268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyarglrywlywntpbncpepxtqcjgfiixi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348966.8703892-455-186141546260750/AnsiballZ_systemd_service.py'
Feb 17 17:22:47 compute-0 sudo[172268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:47 compute-0 python3.9[172271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:47 compute-0 sudo[172268]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:47 compute-0 sudo[172422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnefmpomvyjebpghqgyoeamebiikvpiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348967.5772843-455-204868880206496/AnsiballZ_systemd_service.py'
Feb 17 17:22:47 compute-0 sudo[172422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:48 compute-0 python3.9[172425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:48 compute-0 sudo[172422]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:48 compute-0 sudo[172576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyljgpsyrbnnmymkmjertfreyaeigkyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348968.2441742-455-33454073891815/AnsiballZ_systemd_service.py'
Feb 17 17:22:48 compute-0 sudo[172576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:48 compute-0 python3.9[172579]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:22:48 compute-0 sudo[172576]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:49 compute-0 sudo[172730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fndcmbmsxpbpbvepwmsbcxwqvycnmcfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348969.080399-514-172963105568960/AnsiballZ_file.py'
Feb 17 17:22:49 compute-0 sudo[172730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:49 compute-0 python3.9[172733]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:49 compute-0 sudo[172730]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:49 compute-0 sudo[172883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plyvykghikqqlbrdcqeuyackhnowlffl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348969.5762777-514-254483009396857/AnsiballZ_file.py'
Feb 17 17:22:49 compute-0 sudo[172883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:49 compute-0 python3.9[172886]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:49 compute-0 sudo[172883]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:50 compute-0 sudo[173036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssrbonwyybsncfykxcxvtyeeihlvhmeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348970.0959706-514-89574907838242/AnsiballZ_file.py'
Feb 17 17:22:50 compute-0 sudo[173036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:50 compute-0 python3.9[173039]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:50 compute-0 sudo[173036]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:50 compute-0 sudo[173189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uprbovgwxhjriskzmeldrympieelbmht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348970.6101508-514-174594760813078/AnsiballZ_file.py'
Feb 17 17:22:50 compute-0 sudo[173189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:51 compute-0 python3.9[173192]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:51 compute-0 sudo[173189]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:51 compute-0 sudo[173342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhlkatoklrzxfmxlzglzsjwlalacvxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348971.1377811-514-168474536228133/AnsiballZ_file.py'
Feb 17 17:22:51 compute-0 sudo[173342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:51 compute-0 python3.9[173345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:51 compute-0 sudo[173342]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:52 compute-0 sudo[173495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlwmprpdpypsjgieeorqaoanqdxknult ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348971.8041215-514-118594259030809/AnsiballZ_file.py'
Feb 17 17:22:52 compute-0 sudo[173495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:52 compute-0 python3.9[173498]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:52 compute-0 sudo[173495]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:52 compute-0 sudo[173648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnlxdhtvrtsoypbyjtfoagpozckgimed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348972.3481703-514-212715555976485/AnsiballZ_file.py'
Feb 17 17:22:52 compute-0 sudo[173648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:52 compute-0 python3.9[173651]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:52 compute-0 sudo[173648]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:53 compute-0 sudo[173801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntcazhearjxikvtarnbooftgpgmjpuii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348972.8513596-514-139692849762668/AnsiballZ_file.py'
Feb 17 17:22:53 compute-0 sudo[173801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:53 compute-0 python3.9[173804]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:53 compute-0 sudo[173801]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:53 compute-0 sudo[173954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdxvxetrtrohwssoepjbtqtvcbetnpht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348973.409215-571-17377548995004/AnsiballZ_file.py'
Feb 17 17:22:53 compute-0 sudo[173954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:53 compute-0 python3.9[173957]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:53 compute-0 sudo[173954]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:54 compute-0 sudo[174120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgokszukmetnpipybewlrmejljckvyvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348973.9534855-571-5286745834330/AnsiballZ_file.py'
Feb 17 17:22:54 compute-0 sudo[174120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:54 compute-0 podman[174081]: 2026-02-17 17:22:54.302098526 +0000 UTC m=+0.132388486 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 17 17:22:54 compute-0 python3.9[174129]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:54 compute-0 sudo[174120]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:54 compute-0 sudo[174286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwmscfrrrpgekhitoeofboerjctvbwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348974.5849555-571-193891253642806/AnsiballZ_file.py'
Feb 17 17:22:54 compute-0 sudo[174286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:55 compute-0 python3.9[174289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:55 compute-0 sudo[174286]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:55 compute-0 sudo[174439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsnnhelpwymmdfoduiegoqllhxesgbgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348975.1463869-571-266069735632143/AnsiballZ_file.py'
Feb 17 17:22:55 compute-0 sudo[174439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:55 compute-0 python3.9[174442]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:55 compute-0 sudo[174439]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:55 compute-0 sudo[174592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgusvtcqbuwsirtujewemjpbcsgnrwzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348975.7042086-571-178897114623711/AnsiballZ_file.py'
Feb 17 17:22:55 compute-0 sudo[174592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:56 compute-0 python3.9[174595]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:56 compute-0 sudo[174592]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:56 compute-0 sudo[174745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zngkwdgcuazbcadlczqapdnsghjqzhzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348976.246583-571-87672490460580/AnsiballZ_file.py'
Feb 17 17:22:56 compute-0 sudo[174745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:56 compute-0 python3.9[174748]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:56 compute-0 sudo[174745]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:57 compute-0 sudo[174898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfamvwkxamlicoftoivjyoxbmzypageb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348976.8310406-571-266630238132236/AnsiballZ_file.py'
Feb 17 17:22:57 compute-0 sudo[174898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:57 compute-0 python3.9[174901]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:57 compute-0 sudo[174898]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:57 compute-0 sudo[175051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsmakrnxdvgpscasypykionnlxaeliaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348977.3405216-571-198581648842847/AnsiballZ_file.py'
Feb 17 17:22:57 compute-0 sudo[175051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:57 compute-0 python3.9[175054]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:22:57 compute-0 sudo[175051]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:58 compute-0 sudo[175204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odsldvxwqgmmthuuipudytmklhbdinfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348977.9998393-629-255347708970145/AnsiballZ_command.py'
Feb 17 17:22:58 compute-0 sudo[175204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:22:58 compute-0 python3.9[175207]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:22:58 compute-0 sudo[175204]: pam_unix(sudo:session): session closed for user root
Feb 17 17:22:59 compute-0 python3.9[175359]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:22:59 compute-0 sudo[175509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hduimsrgtsfrehxuinbgyfgytdmratkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348979.5313275-647-264172022927115/AnsiballZ_systemd_service.py'
Feb 17 17:22:59 compute-0 sudo[175509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:00 compute-0 python3.9[175512]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:23:00 compute-0 systemd[1]: Reloading.
Feb 17 17:23:00 compute-0 systemd-rc-local-generator[175539]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:23:00 compute-0 systemd-sysv-generator[175544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:23:00 compute-0 sudo[175509]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:00 compute-0 sudo[175704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcslkjdnrlxtlacgruuiywatvuhplbqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348980.4217095-655-119365734053166/AnsiballZ_command.py'
Feb 17 17:23:00 compute-0 sudo[175704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:00 compute-0 python3.9[175707]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:00 compute-0 sudo[175704]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:01 compute-0 sudo[175858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oppuftdtcxiggdofnskrrmfsetkwkgvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348981.000859-655-8563856006669/AnsiballZ_command.py'
Feb 17 17:23:01 compute-0 sudo[175858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:01 compute-0 python3.9[175861]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:01 compute-0 sudo[175858]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:01 compute-0 sudo[176012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofomidtptqozhxecpaexaeummpolaqqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348981.6733341-655-132222041482909/AnsiballZ_command.py'
Feb 17 17:23:01 compute-0 sudo[176012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:02 compute-0 python3.9[176015]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:02 compute-0 sudo[176012]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:02 compute-0 sudo[176166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmgopdmfpeovalrkrutlrhzqwpsbnvuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348982.195764-655-103311013743385/AnsiballZ_command.py'
Feb 17 17:23:02 compute-0 sudo[176166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:02 compute-0 python3.9[176169]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:02 compute-0 sudo[176166]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:02 compute-0 sudo[176320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvxlvlcyudroggfoogdxqmywsyvalodx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348982.7424128-655-181336698038443/AnsiballZ_command.py'
Feb 17 17:23:02 compute-0 sudo[176320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:03 compute-0 python3.9[176323]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:03 compute-0 sudo[176320]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:03 compute-0 sudo[176474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khqzlfcsvjmsamnatjdlkyhyojslhhng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348983.260315-655-223553100054062/AnsiballZ_command.py'
Feb 17 17:23:03 compute-0 sudo[176474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:03 compute-0 python3.9[176477]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:03 compute-0 sudo[176474]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:04 compute-0 sudo[176628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooyavbafdlyaoiaehqtjpgcdezoelvgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348983.8125353-655-4792822104242/AnsiballZ_command.py'
Feb 17 17:23:04 compute-0 sudo[176628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:04 compute-0 python3.9[176631]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:04 compute-0 sudo[176628]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:04 compute-0 sudo[176782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thmmtdnwtzeizildwehkwzlxnhdajfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348984.29909-655-53097080496933/AnsiballZ_command.py'
Feb 17 17:23:04 compute-0 sudo[176782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:04 compute-0 python3.9[176785]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:23:04 compute-0 sudo[176782]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:05 compute-0 sudo[176936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yasmkgokdugwdcoehjevzxfrdjnulqpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348985.6105173-734-84703809623745/AnsiballZ_file.py'
Feb 17 17:23:05 compute-0 sudo[176936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:06 compute-0 python3.9[176939]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:06 compute-0 sudo[176936]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:06 compute-0 sudo[177089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbcdjchdzixaqmnxwofotalogxurfzfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348986.1710224-734-59224717965601/AnsiballZ_file.py'
Feb 17 17:23:06 compute-0 sudo[177089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:06 compute-0 python3.9[177092]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:06 compute-0 sudo[177089]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:06 compute-0 sudo[177242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgjjiotmqtpwtvgtshpydnxuoijmbyaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348986.772939-749-100663707516171/AnsiballZ_file.py'
Feb 17 17:23:06 compute-0 sudo[177242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:07 compute-0 python3.9[177245]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:07 compute-0 sudo[177242]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:07 compute-0 sudo[177395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdlyvzxrzwzozhntucnuhnvcbciinbwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348987.2797909-749-149843334839119/AnsiballZ_file.py'
Feb 17 17:23:07 compute-0 sudo[177395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:07 compute-0 python3.9[177398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:07 compute-0 sudo[177395]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:08 compute-0 sudo[177548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajqjlopsjsttycmkuugmwaazqgerkqbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348987.8216817-749-65921317124222/AnsiballZ_file.py'
Feb 17 17:23:08 compute-0 sudo[177548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:08 compute-0 python3.9[177551]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:08 compute-0 sudo[177548]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:08 compute-0 sudo[177701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjjdsqeozqjulrpnnwizexfiagpbvjie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348988.3321092-749-244363602795013/AnsiballZ_file.py'
Feb 17 17:23:08 compute-0 sudo[177701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:08 compute-0 python3.9[177704]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:08 compute-0 sudo[177701]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:09 compute-0 sudo[177854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nolozsizdzinrfolranrodsosvppfqlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348988.8161314-749-26775580652978/AnsiballZ_file.py'
Feb 17 17:23:09 compute-0 sudo[177854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:09 compute-0 python3.9[177857]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:09 compute-0 sudo[177854]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:09 compute-0 sudo[178007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blrdokiiruiogszajycvvhuftdpkumuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348989.3379154-749-196519850188595/AnsiballZ_file.py'
Feb 17 17:23:09 compute-0 sudo[178007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:09 compute-0 python3.9[178010]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:09 compute-0 sudo[178007]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:10 compute-0 sudo[178160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrpfzuvdljkkeuexriexmwvupogiaesu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348989.8486958-749-78794074754561/AnsiballZ_file.py'
Feb 17 17:23:10 compute-0 sudo[178160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:10 compute-0 python3.9[178163]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:10 compute-0 sudo[178160]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:23:10.936 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:23:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:23:10.937 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:23:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:23:10.938 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:23:12 compute-0 podman[178188]: 2026-02-17 17:23:12.701719948 +0000 UTC m=+0.046439399 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 17 17:23:14 compute-0 sudo[178332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgaiymjxqbfamzzfmijxapxwvckvyevh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348994.2946975-918-25180306763121/AnsiballZ_getent.py'
Feb 17 17:23:14 compute-0 sudo[178332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:14 compute-0 python3.9[178335]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 17 17:23:14 compute-0 sudo[178332]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:15 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 17 17:23:15 compute-0 sudo[178487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-norzpskpryfiacykbkpaiedsdvopjpoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348995.0095549-926-74940302865754/AnsiballZ_group.py'
Feb 17 17:23:15 compute-0 sudo[178487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:15 compute-0 python3.9[178490]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:23:15 compute-0 groupadd[178491]: group added to /etc/group: name=nova, GID=42436
Feb 17 17:23:15 compute-0 groupadd[178491]: group added to /etc/gshadow: name=nova
Feb 17 17:23:15 compute-0 groupadd[178491]: new group: name=nova, GID=42436
Feb 17 17:23:15 compute-0 sudo[178487]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:16 compute-0 sudo[178646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fplyyrvpqcysmflxptmzwdsavifurafu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771348995.8629563-934-49651488282883/AnsiballZ_user.py'
Feb 17 17:23:16 compute-0 sudo[178646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:16 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 17 17:23:16 compute-0 python3.9[178649]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 17 17:23:16 compute-0 useradd[178652]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 17 17:23:16 compute-0 useradd[178652]: add 'nova' to group 'libvirt'
Feb 17 17:23:16 compute-0 useradd[178652]: add 'nova' to shadow group 'libvirt'
Feb 17 17:23:16 compute-0 sudo[178646]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:17 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 17 17:23:17 compute-0 sshd-session[178684]: Accepted publickey for zuul from 192.168.122.30 port 56884 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:23:17 compute-0 systemd-logind[806]: New session 24 of user zuul.
Feb 17 17:23:17 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 17 17:23:17 compute-0 sshd-session[178684]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:23:17 compute-0 sshd-session[178687]: Received disconnect from 192.168.122.30 port 56884:11: disconnected by user
Feb 17 17:23:17 compute-0 sshd-session[178687]: Disconnected from user zuul 192.168.122.30 port 56884
Feb 17 17:23:17 compute-0 sshd-session[178684]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:23:17 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 17 17:23:17 compute-0 systemd-logind[806]: Session 24 logged out. Waiting for processes to exit.
Feb 17 17:23:17 compute-0 systemd-logind[806]: Removed session 24.
Feb 17 17:23:18 compute-0 python3.9[178837]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:18 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 17 17:23:18 compute-0 python3.9[178914]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:19 compute-0 python3.9[179064]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:19 compute-0 python3.9[179185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348998.731386-959-38888055786247/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:20 compute-0 python3.9[179335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:20 compute-0 python3.9[179458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771348999.6770182-959-51518118465937/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:20 compute-0 sshd-session[179406]: Received disconnect from 45.148.10.152 port 62302:11:  [preauth]
Feb 17 17:23:20 compute-0 sshd-session[179406]: Disconnected from authenticating user root 45.148.10.152 port 62302 [preauth]
Feb 17 17:23:20 compute-0 python3.9[179608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:21 compute-0 python3.9[179729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349000.5843449-959-33876887669893/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:21 compute-0 python3.9[179879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:22 compute-0 python3.9[180000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349001.5802453-1013-199999735757557/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:22 compute-0 sudo[180150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjlxypvsjxrbrieesqqabzuxgwzydamd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349002.6197538-1028-169068732651374/AnsiballZ_file.py'
Feb 17 17:23:22 compute-0 sudo[180150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:23 compute-0 python3.9[180153]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:23 compute-0 sudo[180150]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:23 compute-0 sudo[180303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwsocfexbrbjagapwcgfciveolthjizm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349003.1988502-1036-56996162564730/AnsiballZ_copy.py'
Feb 17 17:23:23 compute-0 sudo[180303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:23 compute-0 python3.9[180306]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:23 compute-0 sudo[180303]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:23 compute-0 sudo[180456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkhsalfpvxwecthtyyksjkdabmgxssbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349003.7471826-1044-157580861888217/AnsiballZ_stat.py'
Feb 17 17:23:23 compute-0 sudo[180456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:24 compute-0 python3.9[180459]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:24 compute-0 sudo[180456]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:24 compute-0 sudo[180621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwufmzcmxuvihoyirdumdatneyjiumns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349004.2753932-1052-178855858616608/AnsiballZ_stat.py'
Feb 17 17:23:24 compute-0 sudo[180621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:24 compute-0 podman[180583]: 2026-02-17 17:23:24.554285318 +0000 UTC m=+0.072908609 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 17 17:23:24 compute-0 python3.9[180629]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:24 compute-0 sudo[180621]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:25 compute-0 sudo[180759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idlznhkiklhuslmkusajqjcbtyqasktq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349004.2753932-1052-178855858616608/AnsiballZ_copy.py'
Feb 17 17:23:25 compute-0 sudo[180759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:25 compute-0 python3.9[180762]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771349004.2753932-1052-178855858616608/.source _original_basename=.qb_98etx follow=False checksum=9002022ed8636be8f08cc3de1bb954ff0b29e32e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 17 17:23:25 compute-0 sudo[180759]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:25 compute-0 python3.9[180914]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:26 compute-0 sudo[181068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egfciujdzrmzzfsvbunyicqdrpkaxgyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349006.0849605-1080-91505407021534/AnsiballZ_file.py'
Feb 17 17:23:26 compute-0 sudo[181068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:26 compute-0 python3.9[181071]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:26 compute-0 sshd-session[180917]: Invalid user admin from 209.38.233.161 port 57488
Feb 17 17:23:26 compute-0 sudo[181068]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:26 compute-0 sshd-session[180917]: Connection closed by invalid user admin 209.38.233.161 port 57488 [preauth]
Feb 17 17:23:26 compute-0 sudo[181221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqjijzbdyskqyupnqxbtizfinhprcdoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349006.7007647-1088-99025316256090/AnsiballZ_file.py'
Feb 17 17:23:26 compute-0 sudo[181221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:27 compute-0 python3.9[181224]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:27 compute-0 sudo[181221]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:27 compute-0 python3.9[181374]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:29 compute-0 sudo[181795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osruxnvsxcbwsbxlzldauvovdqakwdpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349009.029894-1122-144670658536074/AnsiballZ_container_config_data.py'
Feb 17 17:23:29 compute-0 sudo[181795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:29 compute-0 python3.9[181798]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 17 17:23:29 compute-0 sudo[181795]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:30 compute-0 sudo[181948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpyflqugjqlgigcuxokidulyidxeihlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349009.982046-1133-51263911660303/AnsiballZ_container_config_hash.py'
Feb 17 17:23:30 compute-0 sudo[181948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:30 compute-0 python3.9[181951]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:23:30 compute-0 sudo[181948]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:31 compute-0 sudo[182101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgiqxnqoeancxdkvzplwzlkkrrvtccgg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349010.7998137-1143-237128895683325/AnsiballZ_edpm_container_manage.py'
Feb 17 17:23:31 compute-0 sudo[182101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:31 compute-0 python3[182104]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:23:31 compute-0 podman[182141]: 2026-02-17 17:23:31.636961531 +0000 UTC m=+0.045882138 container create b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, container_name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=nova_compute_init, managed_by=edpm_ansible, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:23:31 compute-0 podman[182141]: 2026-02-17 17:23:31.612107801 +0000 UTC m=+0.021028438 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 17 17:23:31 compute-0 python3[182104]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 17 17:23:31 compute-0 sudo[182101]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:32 compute-0 sudo[182329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whrhdbcxhzochvejbquajkcrzehnyeot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349011.8924322-1151-106954241044947/AnsiballZ_stat.py'
Feb 17 17:23:32 compute-0 sudo[182329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:32 compute-0 python3.9[182332]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:32 compute-0 sudo[182329]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:33 compute-0 python3.9[182484]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:23:33 compute-0 sudo[182634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luvhypdumvhwzxceimznvqemdqqkhnaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349013.6671963-1178-7806056180145/AnsiballZ_stat.py'
Feb 17 17:23:33 compute-0 sudo[182634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:34 compute-0 python3.9[182637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:34 compute-0 sudo[182634]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:34 compute-0 sudo[182760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgrmwivjlovylktuxcuvouzzwqpuqzzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349013.6671963-1178-7806056180145/AnsiballZ_copy.py'
Feb 17 17:23:34 compute-0 sudo[182760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:34 compute-0 python3.9[182763]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349013.6671963-1178-7806056180145/.source.yaml _original_basename=.cmuw0v1v follow=False checksum=b3fdc261c07339344d0d907649ae35f6fce585f0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:34 compute-0 sudo[182760]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:35 compute-0 sudo[182913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttvpvctlfoqynmtusionljhpwserpgvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349015.018707-1195-250199241794292/AnsiballZ_file.py'
Feb 17 17:23:35 compute-0 sudo[182913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:35 compute-0 python3.9[182916]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:35 compute-0 sudo[182913]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:35 compute-0 sudo[183066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmgmgprpbyeibwdvyynufvkptmwudcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349015.6297758-1203-129753050713917/AnsiballZ_file.py'
Feb 17 17:23:35 compute-0 sudo[183066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:36 compute-0 python3.9[183069]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:23:36 compute-0 sudo[183066]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:36 compute-0 sudo[183219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrocnrsthzkpihzdxqzjaxchhlwuudat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349016.2952576-1211-262421816824547/AnsiballZ_stat.py'
Feb 17 17:23:36 compute-0 sudo[183219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:36 compute-0 python3.9[183222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:36 compute-0 sudo[183219]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:37 compute-0 sudo[183343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzulxuejxdnlngncvlexsppsdftijizc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349016.2952576-1211-262421816824547/AnsiballZ_copy.py'
Feb 17 17:23:37 compute-0 sudo[183343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:37 compute-0 python3.9[183346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349016.2952576-1211-262421816824547/.source.json _original_basename=.3tcl06s0 follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:37 compute-0 sudo[183343]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:37 compute-0 python3.9[183496]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:39 compute-0 sudo[183917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oydimthghmoiznqfucpfgkrjywuzxolw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349019.070219-1251-146210186978863/AnsiballZ_container_config_data.py'
Feb 17 17:23:39 compute-0 sudo[183917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:39 compute-0 python3.9[183920]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 17 17:23:39 compute-0 sudo[183917]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:39 compute-0 sudo[184070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxifrldazekjmcrbuakpsvnoslqkoznu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349019.8062437-1262-242843675145538/AnsiballZ_container_config_hash.py'
Feb 17 17:23:39 compute-0 sudo[184070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:40 compute-0 python3.9[184073]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:23:40 compute-0 sudo[184070]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:40 compute-0 sudo[184223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnvkjykvfdlrilknniwbajdjfgzmbntt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349020.4963965-1272-243848510447349/AnsiballZ_edpm_container_manage.py'
Feb 17 17:23:40 compute-0 sudo[184223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:40 compute-0 python3[184226]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:23:41 compute-0 podman[184260]: 2026-02-17 17:23:41.138268653 +0000 UTC m=+0.051902543 container create 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, org.label-schema.build-date=20260127, tcib_managed=true, config_id=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 17 17:23:41 compute-0 podman[184260]: 2026-02-17 17:23:41.106009195 +0000 UTC m=+0.019643175 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 17 17:23:41 compute-0 python3[184226]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 17 17:23:41 compute-0 sudo[184223]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:41 compute-0 sudo[184448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eofaqovicrvqgbdpcplufctaoeogparp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349021.3755343-1280-167617555340074/AnsiballZ_stat.py'
Feb 17 17:23:41 compute-0 sudo[184448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:41 compute-0 python3.9[184451]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:41 compute-0 sudo[184448]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:42 compute-0 sudo[184603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpqymlbbugsyzivulnawtfyddxmqtscs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349021.9701233-1289-157521658223966/AnsiballZ_file.py'
Feb 17 17:23:42 compute-0 sudo[184603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:42 compute-0 python3.9[184606]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:42 compute-0 sudo[184603]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:42 compute-0 sudo[184680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntciavftoqbroqllumysxpwzcbctrxst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349021.9701233-1289-157521658223966/AnsiballZ_stat.py'
Feb 17 17:23:42 compute-0 sudo[184680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:42 compute-0 python3.9[184683]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:42 compute-0 sudo[184680]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:42 compute-0 podman[184684]: 2026-02-17 17:23:42.791668145 +0000 UTC m=+0.047481897 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 17 17:23:43 compute-0 sudo[184849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fygjnpysmixuhtmtmspdoummsaqkbqua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349022.7573383-1289-159005311569764/AnsiballZ_copy.py'
Feb 17 17:23:43 compute-0 sudo[184849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:43 compute-0 python3.9[184852]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771349022.7573383-1289-159005311569764/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:43 compute-0 sudo[184849]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:43 compute-0 sudo[184926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuyljdrdakstklttnhffnbcgrlfbgurc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349022.7573383-1289-159005311569764/AnsiballZ_systemd.py'
Feb 17 17:23:43 compute-0 sudo[184926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:43 compute-0 python3.9[184929]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:23:43 compute-0 systemd[1]: Reloading.
Feb 17 17:23:43 compute-0 systemd-sysv-generator[184964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:23:43 compute-0 systemd-rc-local-generator[184959]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:23:43 compute-0 sudo[184926]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:44 compute-0 sudo[185046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmyiycgmxvsowhzjnisgqefnguqbjfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349022.7573383-1289-159005311569764/AnsiballZ_systemd.py'
Feb 17 17:23:44 compute-0 sudo[185046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:44 compute-0 python3.9[185049]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:23:44 compute-0 systemd[1]: Reloading.
Feb 17 17:23:44 compute-0 systemd-sysv-generator[185082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:23:44 compute-0 systemd-rc-local-generator[185074]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:23:44 compute-0 systemd[1]: Starting nova_compute container...
Feb 17 17:23:44 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:23:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:44 compute-0 podman[185095]: 2026-02-17 17:23:44.87926447 +0000 UTC m=+0.095227088 container init 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Feb 17 17:23:44 compute-0 podman[185095]: 2026-02-17 17:23:44.884435324 +0000 UTC m=+0.100397922 container start 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, config_id=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 17 17:23:44 compute-0 podman[185095]: nova_compute
Feb 17 17:23:44 compute-0 nova_compute[185110]: + sudo -E kolla_set_configs
Feb 17 17:23:44 compute-0 systemd[1]: Started nova_compute container.
Feb 17 17:23:44 compute-0 sudo[185046]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Validating config file
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying service configuration files
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Deleting /etc/ceph
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Creating directory /etc/ceph
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /etc/ceph
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Writing out command to execute
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:44 compute-0 nova_compute[185110]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 17 17:23:44 compute-0 nova_compute[185110]: ++ cat /run_command
Feb 17 17:23:44 compute-0 nova_compute[185110]: + CMD=nova-compute
Feb 17 17:23:44 compute-0 nova_compute[185110]: + ARGS=
Feb 17 17:23:44 compute-0 nova_compute[185110]: + sudo kolla_copy_cacerts
Feb 17 17:23:44 compute-0 nova_compute[185110]: + [[ ! -n '' ]]
Feb 17 17:23:44 compute-0 nova_compute[185110]: + . kolla_extend_start
Feb 17 17:23:44 compute-0 nova_compute[185110]: + echo 'Running command: '\''nova-compute'\'''
Feb 17 17:23:44 compute-0 nova_compute[185110]: Running command: 'nova-compute'
Feb 17 17:23:44 compute-0 nova_compute[185110]: + umask 0022
Feb 17 17:23:44 compute-0 nova_compute[185110]: + exec nova-compute
Feb 17 17:23:45 compute-0 python3.9[185271]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:23:46 compute-0 sudo[185422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqzwirnlsrjaezqmkvhbpxdbldprddeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349025.9004006-1334-1604065808875/AnsiballZ_stat.py'
Feb 17 17:23:46 compute-0 sudo[185422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:46 compute-0 python3.9[185425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:23:46 compute-0 sudo[185422]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:46 compute-0 sudo[185548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldegjktbcwnyqfdkodhyqbftfpkkaltj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349025.9004006-1334-1604065808875/AnsiballZ_copy.py'
Feb 17 17:23:46 compute-0 sudo[185548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:46 compute-0 python3.9[185551]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349025.9004006-1334-1604065808875/.source.yaml _original_basename=.pjkf49a7 follow=False checksum=0152844396d396c7f895f1d21870ada39527bd6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:23:46 compute-0 sudo[185548]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:46 compute-0 nova_compute[185110]: 2026-02-17 17:23:46.931 185114 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:46 compute-0 nova_compute[185110]: 2026-02-17 17:23:46.931 185114 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:46 compute-0 nova_compute[185110]: 2026-02-17 17:23:46.931 185114 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:46 compute-0 nova_compute[185110]: 2026-02-17 17:23:46.931 185114 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.059 185114 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.067 185114 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.068 185114 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 17 17:23:47 compute-0 python3.9[185705]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.569 185114 INFO nova.virt.driver [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.672 185114 INFO nova.compute.provider_config [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.685 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.686 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.687 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.688 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.688 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.688 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.688 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.688 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.689 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.690 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.690 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.690 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.690 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.690 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.691 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.691 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.691 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.691 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.691 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.692 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.693 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.693 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.693 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.693 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.693 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.694 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.694 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.694 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.694 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.694 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.695 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.696 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.697 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.698 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.699 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.700 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.701 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.702 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.703 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.704 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.705 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.706 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.706 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.706 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.706 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.706 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.707 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.707 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.707 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.707 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.708 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.709 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.710 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.710 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.710 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.710 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.710 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.711 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.712 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.713 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.714 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.715 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.716 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.717 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.718 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.719 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.720 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.721 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.722 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.723 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.724 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.725 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.726 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.727 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.727 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.727 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.727 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.727 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.728 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.729 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.730 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.731 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.732 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.733 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.734 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.735 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.736 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.737 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.738 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.739 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.740 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.741 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.742 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.743 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.744 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.745 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.746 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.747 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.748 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.749 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.749 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.749 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.749 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.750 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.751 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.752 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.753 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.754 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.755 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.756 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.757 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.758 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.759 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.760 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.761 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.762 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.763 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.763 185114 WARNING oslo_config.cfg [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 17 17:23:47 compute-0 nova_compute[185110]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 17 17:23:47 compute-0 nova_compute[185110]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 17 17:23:47 compute-0 nova_compute[185110]: and ``live_migration_inbound_addr`` respectively.
Feb 17 17:23:47 compute-0 nova_compute[185110]: ).  Its value may be silently ignored in the future.
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.763 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.763 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.764 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.765 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.766 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.767 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.768 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.769 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.770 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.771 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.772 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.773 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.774 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.775 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.776 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.777 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.778 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.779 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.780 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.781 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.782 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.783 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.784 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.785 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.786 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.787 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.787 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.787 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.787 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.787 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.788 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.788 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.788 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.788 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.788 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.789 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.789 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.789 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.789 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.789 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.790 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.790 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.790 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.790 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.790 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.791 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.792 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.792 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.792 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.792 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.793 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.793 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.793 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.793 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.793 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.794 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.795 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.795 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.795 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.795 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.795 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.796 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.796 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.796 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.796 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.796 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.797 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.798 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.798 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.798 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.798 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.798 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.799 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.800 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.800 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.800 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.800 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.800 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.801 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.801 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.801 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.801 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.801 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.802 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.802 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.802 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.802 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.802 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.803 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.804 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.804 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.804 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.804 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.804 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.805 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.806 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.806 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.806 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.806 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.806 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.807 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.807 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.807 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.807 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.807 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.808 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.808 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.808 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.808 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.808 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.809 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.810 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.810 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.810 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.810 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.811 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.811 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.811 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.811 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.811 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.812 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.813 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.813 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.813 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.813 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.813 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.814 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.814 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.814 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.814 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.814 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.815 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.815 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.815 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.815 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.815 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.816 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.817 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.817 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.817 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.817 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.817 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.818 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.818 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.818 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.818 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.818 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.819 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.819 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.819 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.819 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.819 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.820 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.820 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.820 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.820 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.820 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.821 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.821 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.821 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.821 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.821 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.822 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.823 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.823 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.823 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.823 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.823 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.824 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.825 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.825 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.825 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.825 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.825 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.826 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.826 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.826 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.826 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.826 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.827 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.828 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.828 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.828 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.828 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.828 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.829 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.829 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.829 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.829 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.829 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.830 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.830 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.830 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.830 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.830 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.831 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.832 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.832 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.832 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.832 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.832 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.833 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.833 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.833 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.833 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.833 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.834 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.835 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.835 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.835 185114 DEBUG oslo_service.service [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.836 185114 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.853 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.854 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.854 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.854 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 17 17:23:47 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 17 17:23:47 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.917 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fcd31fa1ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.920 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fcd31fa1ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.920 185114 INFO nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Connection event '1' reason 'None'
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.957 185114 WARNING nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 17 17:23:47 compute-0 nova_compute[185110]: 2026-02-17 17:23:47.958 185114 DEBUG nova.virt.libvirt.volume.mount [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 17 17:23:48 compute-0 python3.9[185897]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.696 185114 INFO nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Libvirt host capabilities <capabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]: 
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <host>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <uuid>a56dc234-e7cf-4362-a05f-7ac9718f0a67</uuid>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <arch>x86_64</arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model>EPYC-Rome-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <vendor>AMD</vendor>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <microcode version='16777317'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <signature family='23' model='49' stepping='0'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='x2apic'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='tsc-deadline'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='osxsave'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='hypervisor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='tsc_adjust'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='spec-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='stibp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='arch-capabilities'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='cmp_legacy'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='topoext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='virt-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='lbrv'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='tsc-scale'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='vmcb-clean'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='pause-filter'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='pfthreshold'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='svme-addr-chk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='rdctl-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='skip-l1dfl-vmentry'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='mds-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature name='pschange-mc-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <pages unit='KiB' size='4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <pages unit='KiB' size='2048'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <pages unit='KiB' size='1048576'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <power_management>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <suspend_mem/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <suspend_disk/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <suspend_hybrid/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </power_management>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <iommu support='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <migration_features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <live/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <uri_transports>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <uri_transport>tcp</uri_transport>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <uri_transport>rdma</uri_transport>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </uri_transports>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </migration_features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <topology>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <cells num='1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <cell id='0'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <memory unit='KiB'>7864284</memory>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <pages unit='KiB' size='4'>1966071</pages>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <pages unit='KiB' size='2048'>0</pages>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <distances>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <sibling id='0' value='10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           </distances>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           <cpus num='8'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:           </cpus>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         </cell>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </cells>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </topology>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <cache>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </cache>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <secmodel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model>selinux</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <doi>0</doi>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </secmodel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <secmodel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model>dac</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <doi>0</doi>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </secmodel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </host>
Feb 17 17:23:48 compute-0 nova_compute[185110]: 
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <guest>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <os_type>hvm</os_type>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <arch name='i686'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <wordsize>32</wordsize>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <domain type='qemu'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <domain type='kvm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <pae/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <nonpae/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <acpi default='on' toggle='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <apic default='on' toggle='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <cpuselection/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <deviceboot/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <disksnapshot default='on' toggle='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <externalSnapshot/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </guest>
Feb 17 17:23:48 compute-0 nova_compute[185110]: 
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <guest>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <os_type>hvm</os_type>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <arch name='x86_64'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <wordsize>64</wordsize>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <domain type='qemu'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <domain type='kvm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <acpi default='on' toggle='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <apic default='on' toggle='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <cpuselection/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <deviceboot/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <disksnapshot default='on' toggle='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <externalSnapshot/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </guest>
Feb 17 17:23:48 compute-0 nova_compute[185110]: 
Feb 17 17:23:48 compute-0 nova_compute[185110]: </capabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]: 
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.705 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.718 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 17 17:23:48 compute-0 nova_compute[185110]: <domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <domain>kvm</domain>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <arch>i686</arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <vcpu max='4096'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <iothreads supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <os supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='firmware'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <loader supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>rom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pflash</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='readonly'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>yes</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='secure'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </loader>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </os>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='maximumMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <vendor>AMD</vendor>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='succor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='custom' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 python3.9[186065]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <memoryBacking supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='sourceType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>anonymous</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>memfd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </memoryBacking>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <disk supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='diskDevice'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>disk</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cdrom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>floppy</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>lun</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>fdc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>sata</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </disk>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <graphics supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vnc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egl-headless</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </graphics>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <video supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='modelType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vga</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cirrus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>none</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>bochs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ramfb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </video>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hostdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='mode'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>subsystem</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='startupPolicy'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>mandatory</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>requisite</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>optional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='subsysType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pci</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='capsType'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='pciBackend'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hostdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <rng supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>random</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </rng>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <filesystem supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='driverType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>path</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>handle</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtiofs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </filesystem>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tpm supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-tis</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-crb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emulator</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>external</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendVersion'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>2.0</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </tpm>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <redirdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </redirdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <channel supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </channel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <crypto supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </crypto>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <interface supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>passt</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </interface>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <panic supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>isa</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>hyperv</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </panic>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <console supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>null</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dev</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pipe</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stdio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>udp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tcp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu-vdagent</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </console>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <gic supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <genid supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backup supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <async-teardown supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <s390-pv supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <ps2 supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tdx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sev supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sgx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hyperv supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='features'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>relaxed</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vapic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>spinlocks</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vpindex</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>runtime</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>synic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stimer</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reset</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vendor_id</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>frequencies</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reenlightenment</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tlbflush</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ipi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>avic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emsr_bitmap</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>xmm_input</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hyperv>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <launchSecurity supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]: </domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.724 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 17 17:23:48 compute-0 nova_compute[185110]: <domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <domain>kvm</domain>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <arch>i686</arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <vcpu max='240'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <iothreads supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <os supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='firmware'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <loader supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>rom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pflash</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='readonly'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>yes</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='secure'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </loader>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </os>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='maximumMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <vendor>AMD</vendor>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='succor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='custom' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <memoryBacking supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='sourceType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>anonymous</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>memfd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </memoryBacking>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <disk supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='diskDevice'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>disk</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cdrom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>floppy</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>lun</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ide</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>fdc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>sata</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </disk>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <graphics supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vnc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egl-headless</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </graphics>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <video supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='modelType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vga</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cirrus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>none</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>bochs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ramfb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </video>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hostdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='mode'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>subsystem</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='startupPolicy'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>mandatory</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>requisite</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>optional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='subsysType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pci</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='capsType'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='pciBackend'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hostdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <rng supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>random</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </rng>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <filesystem supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='driverType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>path</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>handle</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtiofs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </filesystem>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tpm supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-tis</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-crb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emulator</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>external</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendVersion'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>2.0</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </tpm>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <redirdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </redirdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <channel supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </channel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <crypto supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </crypto>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <interface supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>passt</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </interface>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <panic supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>isa</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>hyperv</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </panic>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <console supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>null</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dev</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pipe</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stdio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>udp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tcp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu-vdagent</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </console>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <gic supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <genid supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backup supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <async-teardown supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <s390-pv supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <ps2 supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tdx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sev supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sgx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hyperv supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='features'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>relaxed</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vapic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>spinlocks</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vpindex</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>runtime</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>synic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stimer</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reset</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vendor_id</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>frequencies</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reenlightenment</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tlbflush</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ipi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>avic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emsr_bitmap</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>xmm_input</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hyperv>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <launchSecurity supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]: </domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.774 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.779 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 17 17:23:48 compute-0 nova_compute[185110]: <domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <domain>kvm</domain>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <arch>x86_64</arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <vcpu max='4096'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <iothreads supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <os supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='firmware'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>efi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <loader supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>rom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pflash</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='readonly'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>yes</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='secure'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>yes</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </loader>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </os>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='maximumMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <vendor>AMD</vendor>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='succor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='custom' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <memoryBacking supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='sourceType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>anonymous</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>memfd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </memoryBacking>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <disk supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='diskDevice'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>disk</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cdrom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>floppy</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>lun</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>fdc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>sata</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </disk>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <graphics supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vnc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egl-headless</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </graphics>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <video supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='modelType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vga</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cirrus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>none</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>bochs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ramfb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </video>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hostdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='mode'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>subsystem</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='startupPolicy'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>mandatory</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>requisite</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>optional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='subsysType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pci</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='capsType'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='pciBackend'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hostdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <rng supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>random</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </rng>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <filesystem supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='driverType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>path</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>handle</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtiofs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </filesystem>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tpm supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-tis</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-crb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emulator</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>external</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendVersion'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>2.0</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </tpm>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <redirdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </redirdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <channel supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </channel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <crypto supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </crypto>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <interface supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>passt</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </interface>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <panic supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>isa</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>hyperv</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </panic>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <console supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>null</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dev</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pipe</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stdio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>udp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tcp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu-vdagent</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </console>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <gic supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <genid supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backup supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <async-teardown supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <s390-pv supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <ps2 supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tdx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sev supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sgx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hyperv supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='features'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>relaxed</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vapic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>spinlocks</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vpindex</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>runtime</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>synic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stimer</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reset</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vendor_id</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>frequencies</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reenlightenment</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tlbflush</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ipi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>avic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emsr_bitmap</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>xmm_input</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hyperv>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <launchSecurity supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]: </domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.846 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 17 17:23:48 compute-0 nova_compute[185110]: <domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <domain>kvm</domain>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <arch>x86_64</arch>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <vcpu max='240'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <iothreads supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <os supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='firmware'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <loader supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>rom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pflash</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='readonly'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>yes</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='secure'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>no</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </loader>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </os>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='maximumMigratable'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>on</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>off</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <vendor>AMD</vendor>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='succor'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <mode name='custom' supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ddpd-u'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sha512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm3'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sm4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Denverton-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amd-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='auto-ibrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='perfmon-v2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbpb'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='stibp-always-on'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='EPYC-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-128'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-256'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx10-512'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='prefetchiti'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Haswell-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512er'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512pf'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fma4'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tbm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xop'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='amx-tile'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-bf16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-fp16'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bitalg'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrc'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fzrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='la57'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='taa-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ifma'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cmpccxadd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fbsdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='fsrs'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ibrs-all'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='intel-psfd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='lam'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mcdt-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pbrsb-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='psdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='serialize'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vaes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='hle'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='rtm'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512bw'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512cd'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512dq'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512f'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='avx512vl'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='invpcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pcid'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='pku'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='mpx'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='core-capability'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='split-lock-detect'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='cldemote'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='erms'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='gfni'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdir64b'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='movdiri'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='xsaves'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='athlon-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='core2duo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='coreduo-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='n270-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='ss'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <blockers model='phenom-v1'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnow'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <feature name='3dnowext'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </blockers>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </mode>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </cpu>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <memoryBacking supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <enum name='sourceType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>anonymous</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <value>memfd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </memoryBacking>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <disk supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='diskDevice'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>disk</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cdrom</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>floppy</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>lun</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ide</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>fdc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>sata</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </disk>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <graphics supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vnc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egl-headless</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </graphics>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <video supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='modelType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vga</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>cirrus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>none</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>bochs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ramfb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </video>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hostdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='mode'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>subsystem</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='startupPolicy'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>mandatory</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>requisite</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>optional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='subsysType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pci</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>scsi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='capsType'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='pciBackend'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hostdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <rng supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtio-non-transitional</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>random</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>egd</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </rng>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <filesystem supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='driverType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>path</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>handle</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>virtiofs</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </filesystem>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tpm supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-tis</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tpm-crb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emulator</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>external</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendVersion'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>2.0</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </tpm>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <redirdev supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='bus'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>usb</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </redirdev>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <channel supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </channel>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <crypto supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendModel'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>builtin</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </crypto>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <interface supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='backendType'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>default</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>passt</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </interface>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <panic supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='model'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>isa</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>hyperv</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </panic>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <console supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='type'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>null</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vc</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pty</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dev</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>file</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>pipe</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stdio</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>udp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tcp</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>unix</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>qemu-vdagent</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>dbus</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </console>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </devices>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   <features>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <gic supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <genid supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <backup supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <async-teardown supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <s390-pv supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <ps2 supported='yes'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <tdx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sev supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <sgx supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <hyperv supported='yes'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <enum name='features'>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>relaxed</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vapic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>spinlocks</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vpindex</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>runtime</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>synic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>stimer</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reset</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>vendor_id</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>frequencies</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>reenlightenment</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>tlbflush</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>ipi</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>avic</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>emsr_bitmap</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <value>xmm_input</value>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </enum>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       <defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:48 compute-0 nova_compute[185110]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:48 compute-0 nova_compute[185110]:       </defaults>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     </hyperv>
Feb 17 17:23:48 compute-0 nova_compute[185110]:     <launchSecurity supported='no'/>
Feb 17 17:23:48 compute-0 nova_compute[185110]:   </features>
Feb 17 17:23:48 compute-0 nova_compute[185110]: </domainCapabilities>
Feb 17 17:23:48 compute-0 nova_compute[185110]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.912 185114 DEBUG nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.913 185114 INFO nova.virt.libvirt.host [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Secure Boot support detected
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.914 185114 INFO nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.915 185114 INFO nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.924 185114 DEBUG nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.979 185114 INFO nova.virt.node [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Determined node identity c9b7a021-c13f-4158-9f46-47cefef2fece from /var/lib/nova/compute_id
Feb 17 17:23:48 compute-0 nova_compute[185110]: 2026-02-17 17:23:48.996 185114 WARNING nova.compute.manager [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Compute nodes ['c9b7a021-c13f-4158-9f46-47cefef2fece'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.026 185114 INFO nova.compute.manager [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.088 185114 WARNING nova.compute.manager [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.088 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.089 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.089 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.089 185114 DEBUG nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:23:49 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 17 17:23:49 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.296 185114 WARNING nova.virt.libvirt.driver [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.297 185114 DEBUG nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6139MB free_disk=73.43495178222656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.297 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.297 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.313 185114 WARNING nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] No compute node record for compute-0.ctlplane.example.com:c9b7a021-c13f-4158-9f46-47cefef2fece: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host c9b7a021-c13f-4158-9f46-47cefef2fece could not be found.
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.341 185114 INFO nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: c9b7a021-c13f-4158-9f46-47cefef2fece
Feb 17 17:23:49 compute-0 sudo[186242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wknscyadpihfxznhskkyopgvtdtkuiae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349028.9586434-1384-84711074812374/AnsiballZ_podman_container.py'
Feb 17 17:23:49 compute-0 sudo[186242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.426 185114 DEBUG nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:23:49 compute-0 nova_compute[185110]: 2026-02-17 17:23:49.426 185114 DEBUG nova.compute.resource_tracker [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:23:49 compute-0 python3.9[186245]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 17 17:23:49 compute-0 sudo[186242]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:49 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:23:49 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:23:50 compute-0 sudo[186418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neefmyskblxovvozphpiuetqkwveqehe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349029.8833923-1392-54205983756438/AnsiballZ_systemd.py'
Feb 17 17:23:50 compute-0 sudo[186418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:50 compute-0 nova_compute[185110]: 2026-02-17 17:23:50.325 185114 INFO nova.scheduler.client.report [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] [req-62b282ac-9817-4c60-8e71-d7d0ed79fd47] Created resource provider record via placement API for resource provider with UUID c9b7a021-c13f-4158-9f46-47cefef2fece and name compute-0.ctlplane.example.com.
Feb 17 17:23:50 compute-0 python3.9[186421]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 17 17:23:50 compute-0 systemd[1]: Stopping nova_compute container...
Feb 17 17:23:50 compute-0 nova_compute[185110]: 2026-02-17 17:23:50.530 185114 DEBUG oslo_concurrency.lockutils [None req-2fb210a9-b3f7-43ae-8039-ed5a5b86f9ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:23:50 compute-0 nova_compute[185110]: 2026-02-17 17:23:50.530 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:23:50 compute-0 nova_compute[185110]: 2026-02-17 17:23:50.530 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:23:50 compute-0 nova_compute[185110]: 2026-02-17 17:23:50.531 185114 DEBUG oslo_concurrency.lockutils [None req-68346e00-72aa-4b8a-869f-1d67fa9e0f30 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:23:50 compute-0 systemd[1]: libpod-94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42.scope: Deactivated successfully.
Feb 17 17:23:50 compute-0 virtqemud[185833]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 17 17:23:50 compute-0 virtqemud[185833]: hostname: compute-0
Feb 17 17:23:50 compute-0 virtqemud[185833]: End of file while reading data: Input/output error
Feb 17 17:23:50 compute-0 systemd[1]: libpod-94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42.scope: Consumed 3.063s CPU time.
Feb 17 17:23:50 compute-0 podman[186425]: 2026-02-17 17:23:50.918336099 +0000 UTC m=+0.434934133 container died 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute)
Feb 17 17:23:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42-userdata-shm.mount: Deactivated successfully.
Feb 17 17:23:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b-merged.mount: Deactivated successfully.
Feb 17 17:23:50 compute-0 podman[186425]: 2026-02-17 17:23:50.966583662 +0000 UTC m=+0.483181696 container cleanup 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 17 17:23:50 compute-0 podman[186425]: nova_compute
Feb 17 17:23:51 compute-0 podman[186453]: nova_compute
Feb 17 17:23:51 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 17 17:23:51 compute-0 systemd[1]: Stopped nova_compute container.
Feb 17 17:23:51 compute-0 systemd[1]: Starting nova_compute container...
Feb 17 17:23:51 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56b748ece283e12e48c41fbc5a0c6607a696198f6113e8de25a423de79d6f14b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 podman[186464]: 2026-02-17 17:23:51.146590014 +0000 UTC m=+0.092379149 container init 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:23:51 compute-0 podman[186464]: 2026-02-17 17:23:51.153352858 +0000 UTC m=+0.099141973 container start 94564f8bf7387bfa7fe7952fe8680cdc6df6ed06d10752ff67080cbea2320e42 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:23:51 compute-0 nova_compute[186479]: + sudo -E kolla_set_configs
Feb 17 17:23:51 compute-0 podman[186464]: nova_compute
Feb 17 17:23:51 compute-0 systemd[1]: Started nova_compute container.
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Validating config file
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying service configuration files
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /etc/ceph
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Creating directory /etc/ceph
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /etc/ceph
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:51 compute-0 sudo[186418]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Writing out command to execute
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:51 compute-0 nova_compute[186479]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 17 17:23:51 compute-0 nova_compute[186479]: ++ cat /run_command
Feb 17 17:23:51 compute-0 nova_compute[186479]: + CMD=nova-compute
Feb 17 17:23:51 compute-0 nova_compute[186479]: + ARGS=
Feb 17 17:23:51 compute-0 nova_compute[186479]: + sudo kolla_copy_cacerts
Feb 17 17:23:51 compute-0 nova_compute[186479]: + [[ ! -n '' ]]
Feb 17 17:23:51 compute-0 nova_compute[186479]: + . kolla_extend_start
Feb 17 17:23:51 compute-0 nova_compute[186479]: Running command: 'nova-compute'
Feb 17 17:23:51 compute-0 nova_compute[186479]: + echo 'Running command: '\''nova-compute'\'''
Feb 17 17:23:51 compute-0 nova_compute[186479]: + umask 0022
Feb 17 17:23:51 compute-0 nova_compute[186479]: + exec nova-compute
Feb 17 17:23:51 compute-0 sudo[186640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loyufivptpzapqctfwjhojhmvdnswihh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349031.3897212-1401-205701131029155/AnsiballZ_podman_container.py'
Feb 17 17:23:51 compute-0 sudo[186640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:51 compute-0 python3.9[186643]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 17 17:23:51 compute-0 systemd[1]: Started libpod-conmon-b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6.scope.
Feb 17 17:23:51 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b2df73c30ae8f4df335718bab1a8df75dbd12199dc507fd279fc746ae2a427f/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b2df73c30ae8f4df335718bab1a8df75dbd12199dc507fd279fc746ae2a427f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b2df73c30ae8f4df335718bab1a8df75dbd12199dc507fd279fc746ae2a427f/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 17 17:23:52 compute-0 podman[186669]: 2026-02-17 17:23:52.00833837 +0000 UTC m=+0.119340029 container init b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:23:52 compute-0 podman[186669]: 2026-02-17 17:23:52.013774041 +0000 UTC m=+0.124775670 container start b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 17 17:23:52 compute-0 python3.9[186643]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Applying nova statedir ownership
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 17 17:23:52 compute-0 nova_compute_init[186692]: INFO:nova_statedir:Nova statedir ownership complete
Feb 17 17:23:52 compute-0 systemd[1]: libpod-b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6.scope: Deactivated successfully.
Feb 17 17:23:52 compute-0 podman[186705]: 2026-02-17 17:23:52.104695715 +0000 UTC m=+0.025610889 container died b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, org.label-schema.license=GPLv2)
Feb 17 17:23:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6-userdata-shm.mount: Deactivated successfully.
Feb 17 17:23:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-7b2df73c30ae8f4df335718bab1a8df75dbd12199dc507fd279fc746ae2a427f-merged.mount: Deactivated successfully.
Feb 17 17:23:52 compute-0 podman[186705]: 2026-02-17 17:23:52.135428536 +0000 UTC m=+0.056343680 container cleanup b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute_init, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '4e7c7c6595e2dce5f0465dd1d56d23fb04ee5f99dd9a749ec5aad6e79d24726a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3)
Feb 17 17:23:52 compute-0 sudo[186640]: pam_unix(sudo:session): session closed for user root
Feb 17 17:23:52 compute-0 systemd[1]: libpod-conmon-b74bf5dd7c958e11ab63de705be83328e0fb77b84d1b51672159747f0e8a60c6.scope: Deactivated successfully.
Feb 17 17:23:52 compute-0 sshd-session[161537]: Connection closed by 192.168.122.30 port 54016
Feb 17 17:23:52 compute-0 sshd-session[161534]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:23:52 compute-0 systemd-logind[806]: Session 23 logged out. Waiting for processes to exit.
Feb 17 17:23:52 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 17 17:23:52 compute-0 systemd[1]: session-23.scope: Consumed 1min 24.307s CPU time.
Feb 17 17:23:52 compute-0 systemd-logind[806]: Removed session 23.
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.143 186483 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.144 186483 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.144 186483 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.144 186483 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.283 186483 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.292 186483 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.293 186483 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.724 186483 INFO nova.virt.driver [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.817 186483 INFO nova.compute.provider_config [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.831 186483 DEBUG oslo_concurrency.lockutils [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.832 186483 DEBUG oslo_concurrency.lockutils [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.832 186483 DEBUG oslo_concurrency.lockutils [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.832 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.833 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.834 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.835 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.836 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.836 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.836 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.836 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.836 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.837 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.838 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.838 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.838 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.838 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.838 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.839 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.839 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.839 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.839 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.839 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.840 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.840 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.840 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.841 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.842 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.842 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.842 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.842 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.842 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.843 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.844 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.845 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.846 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.847 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.848 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.849 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.850 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.851 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.852 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.852 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.852 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.852 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.852 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.853 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.854 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.855 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.856 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.857 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.858 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.859 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.860 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.860 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.860 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.860 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.860 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.861 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.862 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.862 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.862 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.862 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.862 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.863 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.864 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.865 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.866 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.867 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.868 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.869 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.870 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.871 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.872 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.873 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.874 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.875 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.876 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.877 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.878 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.879 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.880 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.881 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.882 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.883 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.884 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.885 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.886 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.887 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.888 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.888 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.888 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.888 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.889 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.890 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.891 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.892 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.893 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.894 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.894 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.894 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.894 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.894 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.895 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.896 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.897 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.898 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.899 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.900 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.901 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.902 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.903 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.904 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.905 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.906 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.907 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 WARNING oslo_config.cfg [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 17 17:23:53 compute-0 nova_compute[186479]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 17 17:23:53 compute-0 nova_compute[186479]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 17 17:23:53 compute-0 nova_compute[186479]: and ``live_migration_inbound_addr`` respectively.
Feb 17 17:23:53 compute-0 nova_compute[186479]: ).  Its value may be silently ignored in the future.
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.908 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.909 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.910 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.911 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.912 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.913 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.914 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.915 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.916 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.917 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.918 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.919 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.920 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.920 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.920 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.920 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.920 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.921 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.921 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.921 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.921 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.921 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.922 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.923 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.924 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.925 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.926 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.927 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.928 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.929 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.930 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.931 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.932 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.933 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.934 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.935 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.936 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.937 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.938 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.939 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.940 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.941 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.941 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.941 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.941 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.941 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.942 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.943 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.944 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.945 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.946 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.947 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.948 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.949 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.949 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.949 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.949 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.949 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.950 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.951 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.952 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.953 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.954 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.955 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.956 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.957 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.958 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.959 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.960 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.961 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.962 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.963 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.964 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.965 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.966 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.967 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.968 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.969 186483 DEBUG oslo_service.service [None req-a3788479-2ab6-458f-9a89-a81a15a45dcc - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.970 186483 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.983 186483 INFO nova.virt.node [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Determined node identity c9b7a021-c13f-4158-9f46-47cefef2fece from /var/lib/nova/compute_id
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.983 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.984 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.984 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.984 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.995 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fcdd7ee0520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.996 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fcdd7ee0520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 17 17:23:53 compute-0 nova_compute[186479]: 2026-02-17 17:23:53.997 186483 INFO nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Connection event '1' reason 'None'
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.002 186483 INFO nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Libvirt host capabilities <capabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]: 
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <host>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <uuid>a56dc234-e7cf-4362-a05f-7ac9718f0a67</uuid>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <arch>x86_64</arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model>EPYC-Rome-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <vendor>AMD</vendor>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <microcode version='16777317'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <signature family='23' model='49' stepping='0'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='x2apic'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='tsc-deadline'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='osxsave'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='hypervisor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='tsc_adjust'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='spec-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='stibp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='arch-capabilities'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='cmp_legacy'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='topoext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='virt-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='lbrv'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='tsc-scale'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='vmcb-clean'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='pause-filter'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='pfthreshold'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='svme-addr-chk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='rdctl-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='skip-l1dfl-vmentry'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='mds-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature name='pschange-mc-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <pages unit='KiB' size='4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <pages unit='KiB' size='2048'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <pages unit='KiB' size='1048576'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <power_management>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <suspend_mem/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <suspend_disk/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <suspend_hybrid/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </power_management>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <iommu support='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <migration_features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <live/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <uri_transports>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <uri_transport>tcp</uri_transport>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <uri_transport>rdma</uri_transport>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </uri_transports>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </migration_features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <topology>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <cells num='1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <cell id='0'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <memory unit='KiB'>7864284</memory>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <pages unit='KiB' size='4'>1966071</pages>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <pages unit='KiB' size='2048'>0</pages>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <distances>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <sibling id='0' value='10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           </distances>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           <cpus num='8'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:           </cpus>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         </cell>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </cells>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </topology>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <cache>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </cache>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <secmodel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model>selinux</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <doi>0</doi>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </secmodel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <secmodel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model>dac</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <doi>0</doi>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </secmodel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </host>
Feb 17 17:23:54 compute-0 nova_compute[186479]: 
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <guest>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <os_type>hvm</os_type>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <arch name='i686'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <wordsize>32</wordsize>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <domain type='qemu'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <domain type='kvm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <pae/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <nonpae/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <acpi default='on' toggle='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <apic default='on' toggle='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <cpuselection/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <deviceboot/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <disksnapshot default='on' toggle='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <externalSnapshot/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </guest>
Feb 17 17:23:54 compute-0 nova_compute[186479]: 
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <guest>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <os_type>hvm</os_type>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <arch name='x86_64'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <wordsize>64</wordsize>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <domain type='qemu'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <domain type='kvm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <acpi default='on' toggle='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <apic default='on' toggle='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <cpuselection/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <deviceboot/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <disksnapshot default='on' toggle='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <externalSnapshot/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </guest>
Feb 17 17:23:54 compute-0 nova_compute[186479]: 
Feb 17 17:23:54 compute-0 nova_compute[186479]: </capabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]: 
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.007 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.011 186483 DEBUG nova.virt.libvirt.volume.mount [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.012 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 17 17:23:54 compute-0 nova_compute[186479]: <domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <domain>kvm</domain>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <arch>i686</arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <vcpu max='4096'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <iothreads supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <os supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='firmware'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <loader supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>rom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pflash</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='readonly'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>yes</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='secure'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </loader>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='maximumMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <vendor>AMD</vendor>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='succor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='custom' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <memoryBacking supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='sourceType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>anonymous</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>memfd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </memoryBacking>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <disk supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='diskDevice'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>disk</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cdrom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>floppy</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>lun</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>fdc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>sata</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <graphics supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vnc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egl-headless</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <video supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='modelType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vga</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cirrus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>none</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>bochs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ramfb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hostdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='mode'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>subsystem</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='startupPolicy'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>mandatory</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>requisite</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>optional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='subsysType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pci</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='capsType'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='pciBackend'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hostdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <rng supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>random</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <filesystem supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='driverType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>path</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>handle</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtiofs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </filesystem>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tpm supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-tis</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-crb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emulator</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>external</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendVersion'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>2.0</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </tpm>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <redirdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </redirdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <channel supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </channel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <crypto supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </crypto>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <interface supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>passt</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <panic supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>isa</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>hyperv</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </panic>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <console supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>null</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dev</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pipe</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stdio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>udp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tcp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu-vdagent</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </console>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <gic supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <genid supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backup supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <async-teardown supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <s390-pv supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <ps2 supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tdx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sev supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sgx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hyperv supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='features'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>relaxed</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vapic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>spinlocks</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vpindex</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>runtime</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>synic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stimer</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reset</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vendor_id</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>frequencies</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reenlightenment</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tlbflush</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ipi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>avic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emsr_bitmap</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>xmm_input</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hyperv>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <launchSecurity supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]: </domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.019 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 17 17:23:54 compute-0 nova_compute[186479]: <domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <domain>kvm</domain>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <arch>i686</arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <vcpu max='240'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <iothreads supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <os supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='firmware'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <loader supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>rom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pflash</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='readonly'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>yes</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='secure'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </loader>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='maximumMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <vendor>AMD</vendor>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='succor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='custom' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <memoryBacking supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='sourceType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>anonymous</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>memfd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </memoryBacking>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <disk supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='diskDevice'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>disk</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cdrom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>floppy</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>lun</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ide</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>fdc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>sata</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <graphics supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vnc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egl-headless</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <video supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='modelType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vga</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cirrus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>none</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>bochs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ramfb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hostdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='mode'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>subsystem</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='startupPolicy'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>mandatory</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>requisite</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>optional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='subsysType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pci</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='capsType'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='pciBackend'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hostdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <rng supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>random</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <filesystem supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='driverType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>path</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>handle</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtiofs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </filesystem>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tpm supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-tis</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-crb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emulator</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>external</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendVersion'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>2.0</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </tpm>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <redirdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </redirdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <channel supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </channel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <crypto supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </crypto>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <interface supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>passt</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <panic supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>isa</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>hyperv</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </panic>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <console supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>null</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dev</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pipe</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stdio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>udp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tcp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu-vdagent</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </console>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <gic supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <genid supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backup supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <async-teardown supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <s390-pv supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <ps2 supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tdx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sev supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sgx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hyperv supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='features'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>relaxed</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vapic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>spinlocks</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vpindex</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>runtime</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>synic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stimer</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reset</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vendor_id</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>frequencies</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reenlightenment</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tlbflush</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ipi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>avic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emsr_bitmap</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>xmm_input</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hyperv>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <launchSecurity supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]: </domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.087 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.092 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 17 17:23:54 compute-0 nova_compute[186479]: <domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <domain>kvm</domain>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <arch>x86_64</arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <vcpu max='4096'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <iothreads supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <os supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='firmware'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>efi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <loader supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>rom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pflash</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='readonly'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>yes</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='secure'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>yes</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </loader>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='maximumMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <vendor>AMD</vendor>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='succor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='custom' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <memoryBacking supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='sourceType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>anonymous</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>memfd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </memoryBacking>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <disk supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='diskDevice'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>disk</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cdrom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>floppy</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>lun</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>fdc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>sata</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <graphics supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vnc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egl-headless</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <video supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='modelType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vga</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cirrus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>none</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>bochs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ramfb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hostdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='mode'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>subsystem</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='startupPolicy'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>mandatory</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>requisite</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>optional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='subsysType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pci</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='capsType'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='pciBackend'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hostdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <rng supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>random</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <filesystem supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='driverType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>path</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>handle</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtiofs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </filesystem>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tpm supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-tis</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-crb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emulator</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>external</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendVersion'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>2.0</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </tpm>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <redirdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </redirdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <channel supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </channel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <crypto supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </crypto>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <interface supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>passt</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <panic supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>isa</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>hyperv</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </panic>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <console supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>null</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dev</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pipe</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stdio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>udp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tcp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu-vdagent</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </console>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <gic supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <genid supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backup supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <async-teardown supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <s390-pv supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <ps2 supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tdx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sev supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sgx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hyperv supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='features'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>relaxed</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vapic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>spinlocks</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vpindex</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>runtime</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>synic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stimer</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reset</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vendor_id</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>frequencies</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reenlightenment</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tlbflush</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ipi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>avic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emsr_bitmap</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>xmm_input</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hyperv>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <launchSecurity supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]: </domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.185 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 17 17:23:54 compute-0 nova_compute[186479]: <domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <path>/usr/libexec/qemu-kvm</path>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <domain>kvm</domain>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <arch>x86_64</arch>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <vcpu max='240'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <iothreads supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <os supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='firmware'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <loader supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>rom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pflash</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='readonly'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>yes</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='secure'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>no</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </loader>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-passthrough' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='hostPassthroughMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='maximum' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='maximumMigratable'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>on</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>off</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='host-model' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <vendor>AMD</vendor>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='x2apic'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-deadline'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='hypervisor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc_adjust'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='spec-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='stibp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='cmp_legacy'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='overflow-recov'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='succor'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='amd-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='virt-ssbd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lbrv'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='tsc-scale'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='vmcb-clean'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='flushbyasid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pause-filter'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='pfthreshold'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='svme-addr-chk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <feature policy='disable' name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <mode name='custom' supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Broadwell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cascadelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='ClearwaterForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ddpd-u'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sha512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm3'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sm4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Cooperlake-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Denverton-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Dhyana-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Genoa-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Milan-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Rome-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-Turin-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amd-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='auto-ibrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vp2intersect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fs-gs-base-ns'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibpb-brtype'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='no-nested-data-bp'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='null-sel-clr-base'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='perfmon-v2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbpb'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='srso-user-kernel-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='stibp-always-on'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='EPYC-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='GraniteRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-128'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-256'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx10-512'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='prefetchiti'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Haswell-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-noTSX'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v6'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Icelake-Server-v7'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='IvyBridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='KnightsMill-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4fmaps'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-4vnniw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512er'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512pf'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G4-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Opteron_G5-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fma4'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tbm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xop'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SapphireRapids-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='amx-tile'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-bf16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-fp16'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512-vpopcntdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bitalg'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vbmi2'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrc'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fzrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='la57'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='taa-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='tsx-ldtrk'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='SierraForest-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ifma'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-ne-convert'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx-vnni-int8'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bhi-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='bus-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cmpccxadd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fbsdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='fsrs'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ibrs-all'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='intel-psfd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ipred-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='lam'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mcdt-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pbrsb-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='psdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rrsba-ctrl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='sbdr-ssdp-no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='serialize'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vaes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='vpclmulqdq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Client-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='hle'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='rtm'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Skylake-Server-v5'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512bw'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512cd'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512dq'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512f'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='avx512vl'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='invpcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pcid'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='pku'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='mpx'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v2'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v3'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='core-capability'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='split-lock-detect'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='Snowridge-v4'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='cldemote'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='erms'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='gfni'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdir64b'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='movdiri'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='xsaves'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='athlon-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='core2duo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='coreduo-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='n270-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='ss'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <blockers model='phenom-v1'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnow'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <feature name='3dnowext'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </blockers>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </mode>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <memoryBacking supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <enum name='sourceType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>anonymous</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <value>memfd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </memoryBacking>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <disk supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='diskDevice'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>disk</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cdrom</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>floppy</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>lun</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ide</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>fdc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>sata</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <graphics supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vnc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egl-headless</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <video supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='modelType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vga</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>cirrus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>none</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>bochs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ramfb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hostdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='mode'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>subsystem</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='startupPolicy'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>mandatory</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>requisite</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>optional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='subsysType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pci</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>scsi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='capsType'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='pciBackend'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hostdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <rng supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtio-non-transitional</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>random</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>egd</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <filesystem supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='driverType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>path</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>handle</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>virtiofs</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </filesystem>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tpm supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-tis</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tpm-crb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emulator</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>external</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendVersion'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>2.0</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </tpm>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <redirdev supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='bus'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>usb</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </redirdev>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <channel supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </channel>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <crypto supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendModel'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>builtin</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </crypto>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <interface supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='backendType'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>default</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>passt</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <panic supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='model'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>isa</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>hyperv</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </panic>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <console supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='type'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>null</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vc</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pty</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dev</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>file</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>pipe</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stdio</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>udp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tcp</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>unix</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>qemu-vdagent</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>dbus</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </console>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <gic supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <vmcoreinfo supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <genid supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backingStoreInput supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <backup supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <async-teardown supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <s390-pv supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <ps2 supported='yes'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <tdx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sev supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <sgx supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <hyperv supported='yes'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <enum name='features'>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>relaxed</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vapic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>spinlocks</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vpindex</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>runtime</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>synic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>stimer</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reset</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>vendor_id</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>frequencies</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>reenlightenment</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>tlbflush</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>ipi</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>avic</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>emsr_bitmap</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <value>xmm_input</value>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </enum>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       <defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <spinlocks>4095</spinlocks>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <stimer_direct>on</stimer_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_direct>on</tlbflush_direct>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <tlbflush_extended>on</tlbflush_extended>
Feb 17 17:23:54 compute-0 nova_compute[186479]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 17 17:23:54 compute-0 nova_compute[186479]:       </defaults>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     </hyperv>
Feb 17 17:23:54 compute-0 nova_compute[186479]:     <launchSecurity supported='no'/>
Feb 17 17:23:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:23:54 compute-0 nova_compute[186479]: </domainCapabilities>
Feb 17 17:23:54 compute-0 nova_compute[186479]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.277 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.277 186483 INFO nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Secure Boot support detected
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.279 186483 INFO nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.279 186483 INFO nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.288 186483 DEBUG nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.315 186483 INFO nova.virt.node [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Determined node identity c9b7a021-c13f-4158-9f46-47cefef2fece from /var/lib/nova/compute_id
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.332 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Verified node c9b7a021-c13f-4158-9f46-47cefef2fece matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.353 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.429 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.430 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.430 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.430 186483 DEBUG nova.compute.resource_tracker [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.553 186483 WARNING nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.554 186483 DEBUG nova.compute.resource_tracker [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6100MB free_disk=73.4334602355957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.554 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.555 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.706 186483 DEBUG nova.compute.resource_tracker [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.706 186483 DEBUG nova.compute.resource_tracker [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:23:54 compute-0 podman[186779]: 2026-02-17 17:23:54.743766982 +0000 UTC m=+0.084102210 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.767 186483 DEBUG nova.scheduler.client.report [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Refreshing inventories for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.785 186483 DEBUG nova.scheduler.client.report [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updating ProviderTree inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece from _refresh_and_get_inventory using data: {} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.786 186483 DEBUG nova.compute.provider_tree [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.802 186483 DEBUG nova.scheduler.client.report [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Refreshing aggregate associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.818 186483 DEBUG nova.scheduler.client.report [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Refreshing trait associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.838 186483 DEBUG nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 17 17:23:54 compute-0 nova_compute[186479]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.838 186483 INFO nova.virt.libvirt.host [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] kernel doesn't support AMD SEV
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.839 186483 DEBUG nova.compute.provider_tree [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.839 186483 DEBUG nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.888 186483 DEBUG nova.scheduler.client.report [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updated inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.889 186483 DEBUG nova.compute.provider_tree [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updating resource provider c9b7a021-c13f-4158-9f46-47cefef2fece generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.889 186483 DEBUG nova.compute.provider_tree [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:23:54 compute-0 nova_compute[186479]: 2026-02-17 17:23:54.980 186483 DEBUG nova.compute.provider_tree [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Updating resource provider c9b7a021-c13f-4158-9f46-47cefef2fece generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 17 17:23:55 compute-0 nova_compute[186479]: 2026-02-17 17:23:55.009 186483 DEBUG nova.compute.resource_tracker [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:23:55 compute-0 nova_compute[186479]: 2026-02-17 17:23:55.009 186483 DEBUG oslo_concurrency.lockutils [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:23:55 compute-0 nova_compute[186479]: 2026-02-17 17:23:55.009 186483 DEBUG nova.service [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 17 17:23:55 compute-0 nova_compute[186479]: 2026-02-17 17:23:55.095 186483 DEBUG nova.service [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 17 17:23:55 compute-0 nova_compute[186479]: 2026-02-17 17:23:55.095 186483 DEBUG nova.servicegroup.drivers.db [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 17 17:23:57 compute-0 sshd-session[186805]: Accepted publickey for zuul from 192.168.122.30 port 59420 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:23:57 compute-0 systemd-logind[806]: New session 25 of user zuul.
Feb 17 17:23:57 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 17 17:23:57 compute-0 sshd-session[186805]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:23:58 compute-0 python3.9[186958]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 17 17:23:59 compute-0 sudo[187112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlaudglpwyyktbhsuppbwjvogghkrqpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349039.0256681-31-104798698818159/AnsiballZ_systemd_service.py'
Feb 17 17:23:59 compute-0 sudo[187112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:23:59 compute-0 python3.9[187115]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:23:59 compute-0 systemd[1]: Reloading.
Feb 17 17:23:59 compute-0 systemd-rc-local-generator[187141]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:23:59 compute-0 systemd-sysv-generator[187146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:00 compute-0 sudo[187112]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:01 compute-0 python3.9[187306]: ansible-ansible.builtin.service_facts Invoked
Feb 17 17:24:01 compute-0 network[187323]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 17 17:24:01 compute-0 network[187324]: 'network-scripts' will be removed from distribution in near future.
Feb 17 17:24:01 compute-0 network[187325]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 17 17:24:04 compute-0 sudo[187596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecglbjistjvrbvqhtrjvcrffyitrkrdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349044.5327902-50-248254463253079/AnsiballZ_systemd_service.py'
Feb 17 17:24:04 compute-0 sudo[187596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:05 compute-0 python3.9[187599]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:24:05 compute-0 sudo[187596]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:05 compute-0 sudo[187750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrexthofberuoisbxdhccsivqkfvofon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349045.271447-60-213210492018488/AnsiballZ_file.py'
Feb 17 17:24:05 compute-0 sudo[187750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:05 compute-0 python3.9[187753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:05 compute-0 sudo[187750]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:05 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:24:05 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:24:06 compute-0 sudo[187904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtmcehlvdpzstnbdgvdzetmictizpzvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349046.0110881-68-244410900251398/AnsiballZ_file.py'
Feb 17 17:24:06 compute-0 sudo[187904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:06 compute-0 python3.9[187907]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:06 compute-0 sudo[187904]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:06 compute-0 sudo[188057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqkegibvesolwyeqlvfjzcztkyescuxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349046.575043-77-71067620298743/AnsiballZ_command.py'
Feb 17 17:24:06 compute-0 sudo[188057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:07 compute-0 python3.9[188060]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:24:07 compute-0 sudo[188057]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:07 compute-0 python3.9[188212]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:24:08 compute-0 sudo[188362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxwxqesguujfvohijtlmsyrftopvknmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349047.9545364-95-229600527943162/AnsiballZ_systemd_service.py'
Feb 17 17:24:08 compute-0 sudo[188362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:08 compute-0 python3.9[188365]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:24:08 compute-0 systemd[1]: Reloading.
Feb 17 17:24:08 compute-0 systemd-rc-local-generator[188388]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:08 compute-0 systemd-sysv-generator[188393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:08 compute-0 sudo[188362]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:09 compute-0 sudo[188557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsegrskojugxqrrnfqnrafuzyrwbwyaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349048.862151-103-175968762081060/AnsiballZ_command.py'
Feb 17 17:24:09 compute-0 sudo[188557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:09 compute-0 python3.9[188560]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:24:09 compute-0 sudo[188557]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:09 compute-0 sudo[188711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thyvhwmjcuymdfccixhalnrwpzjwlkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349049.4441414-112-29278503664893/AnsiballZ_file.py'
Feb 17 17:24:09 compute-0 sudo[188711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:09 compute-0 python3.9[188714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:09 compute-0 sudo[188711]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:10 compute-0 python3.9[188864]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:24:10.937 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:24:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:24:10.938 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:24:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:24:10.939 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:24:11 compute-0 sudo[189016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqlqmiqpsxcbeqtpgajrztlwztgdfxxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349050.6774037-128-16863131910806/AnsiballZ_group.py'
Feb 17 17:24:11 compute-0 sudo[189016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:11 compute-0 python3.9[189019]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 17 17:24:11 compute-0 sudo[189016]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:11 compute-0 sudo[189169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyividutfluaortodrbthejmjqsmrhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349051.5072832-139-202869692167226/AnsiballZ_getent.py'
Feb 17 17:24:11 compute-0 sudo[189169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:12 compute-0 python3.9[189172]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 17 17:24:12 compute-0 sudo[189169]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:12 compute-0 sudo[189323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atulusjfqnbeemcmoejqlfxxrnrcauoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349052.1739218-147-228011943645039/AnsiballZ_group.py'
Feb 17 17:24:12 compute-0 sudo[189323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:12 compute-0 python3.9[189326]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 17 17:24:12 compute-0 groupadd[189327]: group added to /etc/group: name=ceilometer, GID=42405
Feb 17 17:24:12 compute-0 groupadd[189327]: group added to /etc/gshadow: name=ceilometer
Feb 17 17:24:12 compute-0 groupadd[189327]: new group: name=ceilometer, GID=42405
Feb 17 17:24:12 compute-0 sudo[189323]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:13 compute-0 sudo[189497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbdecvwbqncmdswrjryfckpbrsjiwcna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349052.7723937-155-157664723139741/AnsiballZ_user.py'
Feb 17 17:24:13 compute-0 sudo[189497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:13 compute-0 podman[189456]: 2026-02-17 17:24:13.202495645 +0000 UTC m=+0.045680873 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:24:13 compute-0 python3.9[189504]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 17 17:24:13 compute-0 useradd[189506]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 17 17:24:13 compute-0 useradd[189506]: add 'ceilometer' to group 'libvirt'
Feb 17 17:24:13 compute-0 useradd[189506]: add 'ceilometer' to shadow group 'libvirt'
Feb 17 17:24:13 compute-0 sudo[189497]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:14 compute-0 python3.9[189662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:15 compute-0 python3.9[189783]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771349054.1837428-181-149680109045328/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:15 compute-0 python3.9[189933]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:16 compute-0 python3.9[190054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771349055.4532878-181-54783669491265/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:16 compute-0 python3.9[190204]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:17 compute-0 python3.9[190325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771349056.463365-181-103028208288019/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:17 compute-0 python3.9[190475]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:18 compute-0 python3.9[190627]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:19 compute-0 python3.9[190779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:19 compute-0 python3.9[190900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349058.6554315-240-208625786127417/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:20 compute-0 python3.9[191050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:20 compute-0 python3.9[191171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349059.6688066-240-37770246210715/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:21 compute-0 python3.9[191321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:21 compute-0 python3.9[191442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349060.7719746-269-268155616734308/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:22 compute-0 python3.9[191592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:22 compute-0 python3.9[191713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349061.878138-285-204239731462676/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:23 compute-0 python3.9[191863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:23 compute-0 python3.9[191984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349062.9959447-300-155867993365133/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:24 compute-0 python3.9[192134]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:24 compute-0 python3.9[192255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349064.1125166-315-253494345176889/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:25 compute-0 podman[192256]: 2026-02-17 17:24:25.060798401 +0000 UTC m=+0.072072229 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:24:25 compute-0 sudo[192431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eburqhgmzxrrssukrupunejowqdzzzgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349065.11865-330-144728423527322/AnsiballZ_file.py'
Feb 17 17:24:25 compute-0 sudo[192431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:25 compute-0 python3.9[192434]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:25 compute-0 sudo[192431]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:25 compute-0 sudo[192584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foeaqufoswutoipeijfimvtkiaevofzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349065.6844592-338-26865114047298/AnsiballZ_file.py'
Feb 17 17:24:25 compute-0 sudo[192584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:26 compute-0 python3.9[192587]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:26 compute-0 sudo[192584]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:26 compute-0 python3.9[192737]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:27 compute-0 python3.9[192889]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:27 compute-0 python3.9[193041]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:28 compute-0 sudo[193193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyhafmcsyytcifazjzenpifaydigrakh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349068.0374637-370-213122626123330/AnsiballZ_file.py'
Feb 17 17:24:28 compute-0 sudo[193193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:28 compute-0 python3.9[193196]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:28 compute-0 sudo[193193]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:28 compute-0 sudo[193346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbpcsqzbaygsusffxjxsslhyuakdutqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349068.6585352-378-251261999337169/AnsiballZ_systemd_service.py'
Feb 17 17:24:28 compute-0 sudo[193346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:29 compute-0 python3.9[193349]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:24:29 compute-0 systemd[1]: Reloading.
Feb 17 17:24:29 compute-0 systemd-rc-local-generator[193376]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:29 compute-0 systemd-sysv-generator[193382]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:29 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 17 17:24:29 compute-0 sudo[193346]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:29 compute-0 sudo[193545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqsxzwpxfkjlnzaymnytioeepohasil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/AnsiballZ_stat.py'
Feb 17 17:24:29 compute-0 sudo[193545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:30 compute-0 python3.9[193548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:30 compute-0 sudo[193545]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:30 compute-0 sudo[193669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlzikylehorndevddehphdpujftpdoym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/AnsiballZ_copy.py'
Feb 17 17:24:30 compute-0 sudo[193669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:30 compute-0 python3.9[193672]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:30 compute-0 sudo[193669]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:30 compute-0 sudo[193746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfrxwkztqvjzguthccbeeaggrhnjkku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/AnsiballZ_stat.py'
Feb 17 17:24:30 compute-0 sudo[193746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:30 compute-0 python3.9[193749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:31 compute-0 sudo[193746]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:31 compute-0 sudo[193870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huhrumtljhygtpgipdjiplfgeczzgngx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/AnsiballZ_copy.py'
Feb 17 17:24:31 compute-0 sudo[193870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:31 compute-0 python3.9[193873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349069.7647746-387-98676503516402/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:31 compute-0 sudo[193870]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:32 compute-0 sudo[194023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbxfagtatdxbcqyfsnmegfgldfocgovu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349071.925841-419-34664561342084/AnsiballZ_file.py'
Feb 17 17:24:32 compute-0 sudo[194023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:32 compute-0 python3.9[194026]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:32 compute-0 sudo[194023]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:32 compute-0 sudo[194176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwviucngmymbemzttsdixugudegstpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349072.4644988-427-261554682895682/AnsiballZ_file.py'
Feb 17 17:24:32 compute-0 sudo[194176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:32 compute-0 python3.9[194179]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:32 compute-0 sudo[194176]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:33 compute-0 sudo[194329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnywwysbtsgwhxmvqtybiocseakxoeka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349073.0217168-435-126772840079105/AnsiballZ_stat.py'
Feb 17 17:24:33 compute-0 sudo[194329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:33 compute-0 python3.9[194332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:33 compute-0 sudo[194329]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:33 compute-0 sudo[194453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfemgzlkvmyfgoxmlfzebtbtlvffymeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349073.0217168-435-126772840079105/AnsiballZ_copy.py'
Feb 17 17:24:33 compute-0 sudo[194453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:33 compute-0 python3.9[194456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349073.0217168-435-126772840079105/.source.json _original_basename=.idlkbhfv follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:33 compute-0 sudo[194453]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:34 compute-0 python3.9[194606]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:35 compute-0 nova_compute[186479]: 2026-02-17 17:24:35.098 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:35 compute-0 nova_compute[186479]: 2026-02-17 17:24:35.121 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:36 compute-0 sudo[195027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpoqcmotndsaknjhfacvwscwxlgfdpxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349075.7746942-475-5405023030290/AnsiballZ_container_config_data.py'
Feb 17 17:24:36 compute-0 sudo[195027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:36 compute-0 python3.9[195030]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 17 17:24:36 compute-0 sudo[195027]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:36 compute-0 sudo[195180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtxndtjyankribmoodgvudvqjdltfukx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349076.6059296-486-269113648318519/AnsiballZ_container_config_hash.py'
Feb 17 17:24:36 compute-0 sudo[195180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:37 compute-0 python3.9[195183]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:24:37 compute-0 sudo[195180]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:37 compute-0 sudo[195335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djhjctkeywmwsznfcagftnwpaggmrvdz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349077.4075704-496-163166417300247/AnsiballZ_edpm_container_manage.py'
Feb 17 17:24:37 compute-0 sudo[195335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:38 compute-0 python3[195338]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:24:38 compute-0 podman[195373]: 2026-02-17 17:24:38.20455723 +0000 UTC m=+0.048998562 container create f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 17 17:24:38 compute-0 podman[195373]: 2026-02-17 17:24:38.177824985 +0000 UTC m=+0.022266227 image pull be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 17 17:24:38 compute-0 python3[195338]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Feb 17 17:24:38 compute-0 sudo[195335]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:38 compute-0 sshd-session[195328]: Received disconnect from 195.178.110.15 port 63836:11:  [preauth]
Feb 17 17:24:38 compute-0 sshd-session[195328]: Disconnected from authenticating user root 195.178.110.15 port 63836 [preauth]
Feb 17 17:24:38 compute-0 sudo[195561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiibyrmyznpxvkhrwamscanqdoumrsru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349078.4690192-504-219309200461098/AnsiballZ_stat.py'
Feb 17 17:24:38 compute-0 sudo[195561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:38 compute-0 python3.9[195564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:38 compute-0 sudo[195561]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:39 compute-0 sudo[195716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxkybzeijtaamsgpyftnztdouixivpet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349079.154644-513-35086354085375/AnsiballZ_file.py'
Feb 17 17:24:39 compute-0 sudo[195716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:39 compute-0 python3.9[195719]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:39 compute-0 sudo[195716]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:39 compute-0 sudo[195793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djnfhtfsfzhjfrimixfopnhbqzylzevw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349079.154644-513-35086354085375/AnsiballZ_stat.py'
Feb 17 17:24:39 compute-0 sudo[195793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:39 compute-0 python3.9[195796]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:39 compute-0 sudo[195793]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:40 compute-0 sudo[195945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suiecsqrjakrudwbslnavlwksikobqpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349080.0210454-513-180154172065857/AnsiballZ_copy.py'
Feb 17 17:24:40 compute-0 sudo[195945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:40 compute-0 python3.9[195948]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771349080.0210454-513-180154172065857/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:40 compute-0 sudo[195945]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:41 compute-0 sudo[196022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atldfojgbhjtujkmkjxjeesdfjjswsbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349080.0210454-513-180154172065857/AnsiballZ_systemd.py'
Feb 17 17:24:41 compute-0 sudo[196022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:41 compute-0 python3.9[196025]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:24:41 compute-0 systemd[1]: Reloading.
Feb 17 17:24:41 compute-0 systemd-sysv-generator[196055]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:41 compute-0 systemd-rc-local-generator[196047]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:41 compute-0 sudo[196022]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:41 compute-0 sudo[196140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdobjdpprqgkmedfxhjkyztcfmkmbmax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349080.0210454-513-180154172065857/AnsiballZ_systemd.py'
Feb 17 17:24:41 compute-0 sudo[196140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:42 compute-0 python3.9[196143]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:24:42 compute-0 systemd[1]: Reloading.
Feb 17 17:24:42 compute-0 systemd-rc-local-generator[196171]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:42 compute-0 systemd-sysv-generator[196175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:42 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Feb 17 17:24:42 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:24:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9509a947a75a59a24222b7a400dbfaf0eca158ae9575a005cd279f6ea1d438f3/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9509a947a75a59a24222b7a400dbfaf0eca158ae9575a005cd279f6ea1d438f3/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9509a947a75a59a24222b7a400dbfaf0eca158ae9575a005cd279f6ea1d438f3/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9509a947a75a59a24222b7a400dbfaf0eca158ae9575a005cd279f6ea1d438f3/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1.
Feb 17 17:24:42 compute-0 podman[196190]: 2026-02-17 17:24:42.691502508 +0000 UTC m=+0.129275157 container init f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + sudo -E kolla_set_configs
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: sudo: unable to send audit message: Operation not permitted
Feb 17 17:24:42 compute-0 sudo[196211]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 17 17:24:42 compute-0 sudo[196211]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 17 17:24:42 compute-0 sudo[196211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 17 17:24:42 compute-0 podman[196190]: 2026-02-17 17:24:42.725366934 +0000 UTC m=+0.163139573 container start f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:24:42 compute-0 podman[196190]: ceilometer_agent_compute
Feb 17 17:24:42 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Validating config file
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Copying service configuration files
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: INFO:__main__:Writing out command to execute
Feb 17 17:24:42 compute-0 sudo[196211]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: ++ cat /run_command
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + ARGS=
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + sudo kolla_copy_cacerts
Feb 17 17:24:42 compute-0 sudo[196140]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:42 compute-0 sudo[196232]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: sudo: unable to send audit message: Operation not permitted
Feb 17 17:24:42 compute-0 sudo[196232]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 17 17:24:42 compute-0 sudo[196232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 17 17:24:42 compute-0 podman[196212]: 2026-02-17 17:24:42.778972865 +0000 UTC m=+0.046493721 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:24:42 compute-0 sudo[196232]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + [[ ! -n '' ]]
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + . kolla_extend_start
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + umask 0022
Feb 17 17:24:42 compute-0 ceilometer_agent_compute[196205]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 17 17:24:42 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Main process exited, code=exited, status=1/FAILURE
Feb 17 17:24:42 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Failed with result 'exit-code'.
Feb 17 17:24:43 compute-0 podman[196363]: 2026-02-17 17:24:43.426739079 +0000 UTC m=+0.059371402 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.489 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.489 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.490 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.491 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.492 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.493 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.494 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.503 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.520 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.522 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.522 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 17 17:24:43 compute-0 python3.9[196399]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.606 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.678 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.679 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.680 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.681 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.682 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.683 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.684 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.685 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.686 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.687 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.688 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.689 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.690 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.691 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.692 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.693 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.694 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.695 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 rsyslogd[1015]: imjournal from <np0005622237:ceilometer_agent_compute>: begin to drop messages due to rate-limiting
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.696 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.697 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.698 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.699 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.700 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.703 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.711 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:24:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:24:43 compute-0 sshd-session[196228]: Invalid user admin from 209.38.233.161 port 33432
Feb 17 17:24:44 compute-0 sshd-session[196228]: Connection closed by invalid user admin 209.38.233.161 port 33432 [preauth]
Feb 17 17:24:44 compute-0 sudo[196561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnziwyqmagwukjpwnrjgnrkdaunbeamc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349083.9737256-558-204638094861356/AnsiballZ_stat.py'
Feb 17 17:24:44 compute-0 sudo[196561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:44 compute-0 python3.9[196564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:44 compute-0 sudo[196561]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:44 compute-0 sudo[196687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkcfwdanxkgdjndrtyknkgrgskzfeipy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349083.9737256-558-204638094861356/AnsiballZ_copy.py'
Feb 17 17:24:44 compute-0 sudo[196687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:44 compute-0 python3.9[196690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349083.9737256-558-204638094861356/.source.yaml _original_basename=.5wc7s0mo follow=False checksum=1b37b4ae5baf93409c338b80db8a2575adf0ca5f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:44 compute-0 sudo[196687]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:45 compute-0 sudo[196840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytjjogbgyrjrbhmjctzfahyrbxlrkunp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349085.0027277-573-53163884389913/AnsiballZ_stat.py'
Feb 17 17:24:45 compute-0 sudo[196840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:45 compute-0 python3.9[196843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:45 compute-0 sudo[196840]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:45 compute-0 sudo[196964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyeomxitiikbupwpattctxdgfgrksipp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349085.0027277-573-53163884389913/AnsiballZ_copy.py'
Feb 17 17:24:45 compute-0 sudo[196964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:45 compute-0 python3.9[196967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349085.0027277-573-53163884389913/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:45 compute-0 sudo[196964]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:46 compute-0 sudo[197117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yeegzbciauhpcynppqllwwufqnvffslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349086.4439237-594-224268698349592/AnsiballZ_file.py'
Feb 17 17:24:46 compute-0 sudo[197117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:46 compute-0 python3.9[197120]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:46 compute-0 sudo[197117]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:47 compute-0 sudo[197270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efnaoqldrgqczhdxbkgvjburmmxusdkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349087.0189762-602-89403084709489/AnsiballZ_file.py'
Feb 17 17:24:47 compute-0 sudo[197270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:47 compute-0 python3.9[197273]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:47 compute-0 sudo[197270]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:47 compute-0 sudo[197423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzqcmkmkkagxewjcpjlthdahsmjulxvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349087.547253-610-179745851374474/AnsiballZ_stat.py'
Feb 17 17:24:47 compute-0 sudo[197423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:47 compute-0 python3.9[197426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:47 compute-0 sudo[197423]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:48 compute-0 sudo[197502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqcytkxnzljrurfrtkcwxkjahzhibbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349087.547253-610-179745851374474/AnsiballZ_file.py'
Feb 17 17:24:48 compute-0 sudo[197502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:48 compute-0 python3.9[197505]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.93hqnkoc recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:48 compute-0 sudo[197502]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:48 compute-0 python3.9[197655]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:50 compute-0 sudo[198076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mviebetldzwgltfqgvydxltsodlnifko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349090.2052052-647-249283358837908/AnsiballZ_container_config_data.py'
Feb 17 17:24:50 compute-0 sudo[198076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:50 compute-0 python3.9[198079]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 17 17:24:50 compute-0 sudo[198076]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:51 compute-0 sudo[198229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvjwjmlrrugpossvrokrhajeslqoiogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349090.9326892-658-129253829300908/AnsiballZ_container_config_hash.py'
Feb 17 17:24:51 compute-0 sudo[198229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:51 compute-0 python3.9[198232]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:24:51 compute-0 sudo[198229]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:51 compute-0 sudo[198382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjgznfeajzgaidnfohwmopzvggnkqiyd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349091.610121-668-247703608225233/AnsiballZ_edpm_container_manage.py'
Feb 17 17:24:51 compute-0 sudo[198382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:52 compute-0 python3[198385]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:24:52 compute-0 podman[198422]: 2026-02-17 17:24:52.222456069 +0000 UTC m=+0.040807605 container create 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:24:52 compute-0 podman[198422]: 2026-02-17 17:24:52.199773282 +0000 UTC m=+0.018124848 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 17 17:24:52 compute-0 python3[198385]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Feb 17 17:24:52 compute-0 sudo[198382]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:52 compute-0 sudo[198611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zggdqtkbghyaukqmgndbgezpcdfybgcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349092.5470324-676-43339539156859/AnsiballZ_stat.py'
Feb 17 17:24:52 compute-0 sudo[198611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:53 compute-0 python3.9[198614]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:53 compute-0 sudo[198611]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.306 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.332 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.332 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.333 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.333 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.334 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.334 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.334 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.334 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.335 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.362 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.362 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.362 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.362 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:24:53 compute-0 sudo[198766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwoggeibknzlbpytjtxkamnpquywdnmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349093.2573657-685-185781168782062/AnsiballZ_file.py'
Feb 17 17:24:53 compute-0 sudo[198766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.505 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.506 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6043MB free_disk=73.43315124511719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.506 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.507 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.589 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.589 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.613 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.628 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.630 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:24:53 compute-0 nova_compute[186479]: 2026-02-17 17:24:53.631 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:24:53 compute-0 python3.9[198769]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:53 compute-0 sudo[198766]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:53 compute-0 sudo[198843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnqnhpgcomkgaxzzaurbqzsdztpqbsbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349093.2573657-685-185781168782062/AnsiballZ_stat.py'
Feb 17 17:24:53 compute-0 sudo[198843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:54 compute-0 python3.9[198846]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:24:54 compute-0 sudo[198843]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:54 compute-0 sudo[198995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsrxdlhpdcigwegipwtpxcnmwzqrzzzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349094.0692096-685-142693539863972/AnsiballZ_copy.py'
Feb 17 17:24:54 compute-0 sudo[198995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:54 compute-0 python3.9[198998]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771349094.0692096-685-142693539863972/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:54 compute-0 sudo[198995]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:54 compute-0 sudo[199072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhgxhlakgofhvdssaiorbqbgkejyrifp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349094.0692096-685-142693539863972/AnsiballZ_systemd.py'
Feb 17 17:24:54 compute-0 sudo[199072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:55 compute-0 python3.9[199075]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:24:55 compute-0 systemd[1]: Reloading.
Feb 17 17:24:55 compute-0 systemd-rc-local-generator[199122]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:55 compute-0 systemd-sysv-generator[199128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:55 compute-0 podman[199077]: 2026-02-17 17:24:55.266918869 +0000 UTC m=+0.091721112 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:24:55 compute-0 sudo[199072]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:55 compute-0 sudo[199218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khugijjdwgsyvphrkewwfskgizyjggfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349094.0692096-685-142693539863972/AnsiballZ_systemd.py'
Feb 17 17:24:55 compute-0 sudo[199218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:55 compute-0 python3.9[199221]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:24:55 compute-0 systemd[1]: Reloading.
Feb 17 17:24:56 compute-0 systemd-rc-local-generator[199246]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:24:56 compute-0 systemd-sysv-generator[199255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:24:56 compute-0 systemd[1]: Starting node_exporter container...
Feb 17 17:24:56 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:24:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb14340eb8db3050e019b75aa2cbd5c3918cc92a24504c547ec08aee5ad865b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb14340eb8db3050e019b75aa2cbd5c3918cc92a24504c547ec08aee5ad865b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 17 17:24:56 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12.
Feb 17 17:24:56 compute-0 podman[199268]: 2026-02-17 17:24:56.376522243 +0000 UTC m=+0.125186918 container init 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.390Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.390Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.390Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=arp
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=bcache
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=bonding
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=cpu
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=edac
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=filefd
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.391Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=netclass
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=netdev
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=netstat
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=nfs
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=nvme
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=softnet
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=systemd
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=xfs
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=node_exporter.go:117 level=info collector=zfs
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 17 17:24:56 compute-0 node_exporter[199284]: ts=2026-02-17T17:24:56.392Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Feb 17 17:24:56 compute-0 podman[199268]: 2026-02-17 17:24:56.406830954 +0000 UTC m=+0.155495609 container start 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:24:56 compute-0 podman[199268]: node_exporter
Feb 17 17:24:56 compute-0 systemd[1]: Started node_exporter container.
Feb 17 17:24:56 compute-0 sudo[199218]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:56 compute-0 podman[199293]: 2026-02-17 17:24:56.48383537 +0000 UTC m=+0.060372086 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:24:57 compute-0 python3.9[199467]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:24:57 compute-0 sudo[199617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azffufywcocskwhovuurbqnsskblrtat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349097.45967-730-269722601361076/AnsiballZ_stat.py'
Feb 17 17:24:57 compute-0 sudo[199617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:57 compute-0 python3.9[199620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:57 compute-0 sudo[199617]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:58 compute-0 sudo[199743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quuquytdentrkdofblrafbjxyouzgngv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349097.45967-730-269722601361076/AnsiballZ_copy.py'
Feb 17 17:24:58 compute-0 sudo[199743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:58 compute-0 python3.9[199746]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349097.45967-730-269722601361076/.source.yaml _original_basename=.0aru4fd0 follow=False checksum=013c2068b201bf0a722c3ccf783324d8575433ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:24:58 compute-0 sudo[199743]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:58 compute-0 sudo[199896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuswiqwacpdlgxoxrqcmmkktrgrxxldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349098.5202894-745-104243079181887/AnsiballZ_stat.py'
Feb 17 17:24:58 compute-0 sudo[199896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:58 compute-0 python3.9[199899]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:24:58 compute-0 sudo[199896]: pam_unix(sudo:session): session closed for user root
Feb 17 17:24:59 compute-0 sudo[200020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etzgcofzrslnhxpgtbcwlwpgxaefesqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349098.5202894-745-104243079181887/AnsiballZ_copy.py'
Feb 17 17:24:59 compute-0 sudo[200020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:24:59 compute-0 python3.9[200023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349098.5202894-745-104243079181887/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:24:59 compute-0 sudo[200020]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:00 compute-0 sudo[200173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhttjowncenwwuxavvojpwxemyoqrpod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349099.980574-766-15481332748103/AnsiballZ_file.py'
Feb 17 17:25:00 compute-0 sudo[200173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:00 compute-0 python3.9[200176]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:00 compute-0 sudo[200173]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:00 compute-0 sudo[200326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rekkyjpsblogmqogssuxkhnyefstnmzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349100.6985962-774-260025231999746/AnsiballZ_file.py'
Feb 17 17:25:00 compute-0 sudo[200326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:01 compute-0 python3.9[200329]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:25:01 compute-0 sudo[200326]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:02 compute-0 sudo[200479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yteamxjjnphysrlshltchyubyxtukgaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349101.3196857-782-170098635690961/AnsiballZ_stat.py'
Feb 17 17:25:02 compute-0 sudo[200479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:03 compute-0 python3.9[200482]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:03 compute-0 sudo[200479]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:03 compute-0 sudo[200558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtbfzdocbgkshnahgyrqrrzwkybdsrth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349101.3196857-782-170098635690961/AnsiballZ_file.py'
Feb 17 17:25:03 compute-0 sudo[200558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:03 compute-0 python3.9[200561]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.2had9iyf recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:03 compute-0 sudo[200558]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:04 compute-0 python3.9[200711]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:06 compute-0 sudo[201132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rknzxchkwuysvaiwqhrihgzdwdcmpfbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349105.4602122-819-205721992746375/AnsiballZ_container_config_data.py'
Feb 17 17:25:06 compute-0 sudo[201132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:06 compute-0 python3.9[201135]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 17 17:25:06 compute-0 sudo[201132]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:08 compute-0 sudo[201285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzirlkokdvaytpoongftdkuudszarlcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349107.617355-830-111408547827821/AnsiballZ_container_config_hash.py'
Feb 17 17:25:08 compute-0 sudo[201285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:08 compute-0 python3.9[201288]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:25:08 compute-0 sudo[201285]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:09 compute-0 sudo[201438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evexlxgydlhtoswmozbtnlijysgqepjz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349109.4844646-840-239028610394166/AnsiballZ_edpm_container_manage.py'
Feb 17 17:25:09 compute-0 sudo[201438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:09 compute-0 python3[201441]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:25:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:25:10.940 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:25:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:25:10.941 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:25:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:25:10.941 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:25:12 compute-0 podman[201454]: 2026-02-17 17:25:12.205917465 +0000 UTC m=+2.251905399 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 17 17:25:12 compute-0 podman[201549]: 2026-02-17 17:25:12.312940944 +0000 UTC m=+0.044208357 container create 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:25:12 compute-0 podman[201549]: 2026-02-17 17:25:12.287057101 +0000 UTC m=+0.018324504 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 17 17:25:12 compute-0 python3[201441]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 17 17:25:12 compute-0 sudo[201438]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:12 compute-0 sudo[201738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryiwynenxbdmshswrbehlggwplbcahut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349112.551569-848-96434346784329/AnsiballZ_stat.py'
Feb 17 17:25:12 compute-0 sudo[201738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:12 compute-0 python3.9[201741]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:25:12 compute-0 sudo[201738]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:13 compute-0 podman[201744]: 2026-02-17 17:25:13.01684966 +0000 UTC m=+0.042430314 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 17 17:25:13 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Main process exited, code=exited, status=1/FAILURE
Feb 17 17:25:13 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Failed with result 'exit-code'.
Feb 17 17:25:13 compute-0 sudo[201913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkyvdapgifrymiynimejvigcommktstx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349113.1400487-857-90330550642886/AnsiballZ_file.py'
Feb 17 17:25:13 compute-0 sudo[201913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:13 compute-0 python3.9[201916]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:13 compute-0 sudo[201913]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:13 compute-0 podman[201917]: 2026-02-17 17:25:13.70351065 +0000 UTC m=+0.083478932 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 17 17:25:13 compute-0 sudo[202009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncvrrwskrzfchwvpsnxoecejdmdbrwbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349113.1400487-857-90330550642886/AnsiballZ_stat.py'
Feb 17 17:25:13 compute-0 sudo[202009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:13 compute-0 python3.9[202012]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:25:13 compute-0 sudo[202009]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:14 compute-0 sudo[202161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fifdemtcmdcwponuvvggxgrqbqmfnuev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349114.0093503-857-177799845356754/AnsiballZ_copy.py'
Feb 17 17:25:14 compute-0 sudo[202161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:14 compute-0 python3.9[202164]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771349114.0093503-857-177799845356754/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:14 compute-0 sudo[202161]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:14 compute-0 sudo[202238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcznqcpshbyyijcczzzjtwshqbjxtbam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349114.0093503-857-177799845356754/AnsiballZ_systemd.py'
Feb 17 17:25:14 compute-0 sudo[202238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:15 compute-0 python3.9[202241]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:25:15 compute-0 systemd[1]: Reloading.
Feb 17 17:25:15 compute-0 systemd-sysv-generator[202264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:25:15 compute-0 systemd-rc-local-generator[202259]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:25:15 compute-0 sudo[202238]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:15 compute-0 sudo[202357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyairryoiljhbegtfeblddzgnnsesyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349114.0093503-857-177799845356754/AnsiballZ_systemd.py'
Feb 17 17:25:15 compute-0 sudo[202357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:15 compute-0 python3.9[202360]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:25:15 compute-0 systemd[1]: Reloading.
Feb 17 17:25:16 compute-0 systemd-rc-local-generator[202387]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:25:16 compute-0 systemd-sysv-generator[202392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:25:16 compute-0 systemd[1]: Starting podman_exporter container...
Feb 17 17:25:16 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:25:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1968608f65208bb1cec89ed01cc3418e50a572d3be23abfd8a2961ab4a170ba7/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 17 17:25:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1968608f65208bb1cec89ed01cc3418e50a572d3be23abfd8a2961ab4a170ba7/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 17 17:25:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6.
Feb 17 17:25:16 compute-0 podman[202406]: 2026-02-17 17:25:16.331669986 +0000 UTC m=+0.108875215 container init 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.347Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.347Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.347Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.347Z caller=handler.go:105 level=info collector=container
Feb 17 17:25:16 compute-0 podman[202406]: 2026-02-17 17:25:16.367122991 +0000 UTC m=+0.144328190 container start 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:25:16 compute-0 podman[202406]: podman_exporter
Feb 17 17:25:16 compute-0 systemd[1]: Starting Podman API Service...
Feb 17 17:25:16 compute-0 systemd[1]: Started podman_exporter container.
Feb 17 17:25:16 compute-0 systemd[1]: Started Podman API Service.
Feb 17 17:25:16 compute-0 sudo[202357]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="Setting parallel job count to 25"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="Using sqlite as database backend"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 17 17:25:16 compute-0 podman[202437]: @ - - [17/Feb/2026:17:25:16 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 17 17:25:16 compute-0 podman[202437]: time="2026-02-17T17:25:16Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 17 17:25:16 compute-0 podman[202431]: 2026-02-17 17:25:16.426922472 +0000 UTC m=+0.054802152 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:25:16 compute-0 systemd[1]: 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6-7f278864fd6326e8.service: Main process exited, code=exited, status=1/FAILURE
Feb 17 17:25:16 compute-0 systemd[1]: 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6-7f278864fd6326e8.service: Failed with result 'exit-code'.
Feb 17 17:25:16 compute-0 podman[202437]: @ - - [17/Feb/2026:17:25:16 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18539 "" "Go-http-client/1.1"
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.439Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.440Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 17 17:25:16 compute-0 podman_exporter[202421]: ts=2026-02-17T17:25:16.440Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 17 17:25:16 compute-0 python3.9[202616]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:25:17 compute-0 sudo[202766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogxafrwzzhrisqodgukmtcnvnkqyevmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349117.4007394-902-184294157054034/AnsiballZ_stat.py'
Feb 17 17:25:17 compute-0 sudo[202766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:17 compute-0 python3.9[202769]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:17 compute-0 sudo[202766]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:18 compute-0 sudo[202892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecckltpihhdoiemhjgwgprdkcczugkak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349117.4007394-902-184294157054034/AnsiballZ_copy.py'
Feb 17 17:25:18 compute-0 sudo[202892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:18 compute-0 python3.9[202895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349117.4007394-902-184294157054034/.source.yaml _original_basename=.139nejc0 follow=False checksum=2e9bcec3872bb2af2f8817cb4c0f82d2f284e266 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:18 compute-0 sudo[202892]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:18 compute-0 sudo[203045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxbbwcfccaywsfwzciweqfunkddmojwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349118.3884196-917-18763542282330/AnsiballZ_stat.py'
Feb 17 17:25:18 compute-0 sudo[203045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:18 compute-0 python3.9[203048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:18 compute-0 sudo[203045]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:19 compute-0 sudo[203169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzghkvcnuyevwldgsegorukbqidrudbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349118.3884196-917-18763542282330/AnsiballZ_copy.py'
Feb 17 17:25:19 compute-0 sudo[203169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:19 compute-0 python3.9[203172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771349118.3884196-917-18763542282330/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:25:19 compute-0 sudo[203169]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:19 compute-0 sudo[203322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxirpshdyibhsezjnnujpvvqyjbxjawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349119.6975734-938-173491614127237/AnsiballZ_file.py'
Feb 17 17:25:19 compute-0 sudo[203322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:20 compute-0 python3.9[203325]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:20 compute-0 sudo[203322]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:20 compute-0 auditd[717]: Audit daemon rotating log files
Feb 17 17:25:20 compute-0 sudo[203475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cepfufxqrhartzywduqhwiodetwimvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349120.2637644-946-263984674542224/AnsiballZ_file.py'
Feb 17 17:25:20 compute-0 sudo[203475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:20 compute-0 python3.9[203478]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 17 17:25:20 compute-0 sudo[203475]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:21 compute-0 sudo[203628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auccudozavycxqrnqgalivvlddzkbpvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349120.860828-954-214243466435677/AnsiballZ_stat.py'
Feb 17 17:25:21 compute-0 sudo[203628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:21 compute-0 python3.9[203631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:21 compute-0 sudo[203628]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:21 compute-0 sudo[203707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnohsxyjfahzfptnuzxsddfttnkduwzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349120.860828-954-214243466435677/AnsiballZ_file.py'
Feb 17 17:25:21 compute-0 sudo[203707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:21 compute-0 python3.9[203710]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.t0abrcoq recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:21 compute-0 sudo[203707]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:22 compute-0 python3.9[203860]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:23 compute-0 sudo[204281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvbdrlabhzvyllzbqlwpgqsxoblgavbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349123.535209-991-255037075128919/AnsiballZ_container_config_data.py'
Feb 17 17:25:23 compute-0 sudo[204281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:23 compute-0 python3.9[204284]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 17 17:25:23 compute-0 sudo[204281]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:24 compute-0 sudo[204434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meofuihyffluoikgbwfbfxprjgcgoqld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349124.248437-1002-237579450953186/AnsiballZ_container_config_hash.py'
Feb 17 17:25:24 compute-0 sudo[204434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:24 compute-0 python3.9[204437]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 17 17:25:24 compute-0 sudo[204434]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:25 compute-0 sudo[204587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnaxcnuwqxrauloxwptnivqbjnbtuzei ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349124.9948514-1012-279267022191504/AnsiballZ_edpm_container_manage.py'
Feb 17 17:25:25 compute-0 sudo[204587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:25 compute-0 python3[204590]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 17 17:25:25 compute-0 podman[204615]: 2026-02-17 17:25:25.752544335 +0000 UTC m=+0.089888318 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 17 17:25:27 compute-0 podman[204658]: 2026-02-17 17:25:27.468653007 +0000 UTC m=+0.817292630 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:25:27 compute-0 podman[204602]: 2026-02-17 17:25:27.953509394 +0000 UTC m=+2.436980439 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 17 17:25:28 compute-0 podman[204752]: 2026-02-17 17:25:28.07618185 +0000 UTC m=+0.049126465 container create 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1770267347, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 17 17:25:28 compute-0 podman[204752]: 2026-02-17 17:25:28.047254823 +0000 UTC m=+0.020199528 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 17 17:25:28 compute-0 python3[204590]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 17 17:25:29 compute-0 sudo[204587]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:29 compute-0 sudo[204940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvnlwarwyhnrwarzlqiyorcmnlzluzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349129.3773563-1020-123568832256191/AnsiballZ_stat.py'
Feb 17 17:25:29 compute-0 sudo[204940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:29 compute-0 python3.9[204943]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:25:29 compute-0 sudo[204940]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:30 compute-0 sudo[205095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqvbpksgmloiacccdogwsmvxvcxmbhkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349130.0979288-1029-110391867049911/AnsiballZ_file.py'
Feb 17 17:25:30 compute-0 sudo[205095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:30 compute-0 python3.9[205098]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:30 compute-0 sudo[205095]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:30 compute-0 sudo[205172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-httdroccipvpzbweacisjjkhvsrutanq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349130.0979288-1029-110391867049911/AnsiballZ_stat.py'
Feb 17 17:25:30 compute-0 sudo[205172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:30 compute-0 python3.9[205175]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:25:30 compute-0 sudo[205172]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:31 compute-0 sudo[205324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqvzjyuvifjhxtlkkiljhntsbnqguqxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349130.9983816-1029-264422529043380/AnsiballZ_copy.py'
Feb 17 17:25:31 compute-0 sudo[205324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:31 compute-0 python3.9[205327]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771349130.9983816-1029-264422529043380/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:31 compute-0 sudo[205324]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:31 compute-0 sudo[205401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgiaepkzoacjkxukjjcqcqnpyqbeyrwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349130.9983816-1029-264422529043380/AnsiballZ_systemd.py'
Feb 17 17:25:31 compute-0 sudo[205401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:32 compute-0 python3.9[205404]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 17 17:25:32 compute-0 systemd[1]: Reloading.
Feb 17 17:25:32 compute-0 systemd-sysv-generator[205435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:25:32 compute-0 systemd-rc-local-generator[205432]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:25:32 compute-0 sudo[205401]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:32 compute-0 sudo[205519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sauemwrrtjsrplsuslakitbatbofunhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349130.9983816-1029-264422529043380/AnsiballZ_systemd.py'
Feb 17 17:25:32 compute-0 sudo[205519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:32 compute-0 python3.9[205522]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 17 17:25:33 compute-0 systemd[1]: Reloading.
Feb 17 17:25:33 compute-0 systemd-sysv-generator[205550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 17 17:25:33 compute-0 systemd-rc-local-generator[205547]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 17 17:25:33 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 17 17:25:33 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9708cff14393dfa7a539c3b7e83088b0e3a26e82ccded142d97ac02833466ce0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 17 17:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9708cff14393dfa7a539c3b7e83088b0e3a26e82ccded142d97ac02833466ce0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 17 17:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9708cff14393dfa7a539c3b7e83088b0e3a26e82ccded142d97ac02833466ce0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 17 17:25:33 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a.
Feb 17 17:25:33 compute-0 podman[205569]: 2026-02-17 17:25:33.432940183 +0000 UTC m=+0.125342182 container init 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, vcs-type=git, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7)
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *bridge.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *coverage.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *datapath.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *iface.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *memory.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *ovn.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *pmd_perf.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *pmd_rxq.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: INFO    17:25:33 main.go:48: registering *vswitch.Collector
Feb 17 17:25:33 compute-0 openstack_network_exporter[205585]: NOTICE  17:25:33 main.go:76: listening on https://:9105/metrics
Feb 17 17:25:33 compute-0 podman[205569]: 2026-02-17 17:25:33.455837565 +0000 UTC m=+0.148239554 container start 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter)
Feb 17 17:25:33 compute-0 podman[205569]: openstack_network_exporter
Feb 17 17:25:33 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 17 17:25:33 compute-0 sudo[205519]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:33 compute-0 podman[205595]: 2026-02-17 17:25:33.530437913 +0000 UTC m=+0.068530683 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z)
Feb 17 17:25:34 compute-0 python3.9[205768]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 17 17:25:34 compute-0 sudo[205918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtrbjnlyuthjlqvnlajcuvfxxnhuuozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349134.5678854-1074-152802034167016/AnsiballZ_stat.py'
Feb 17 17:25:34 compute-0 sudo[205918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:34 compute-0 python3.9[205921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:34 compute-0 sudo[205918]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:35 compute-0 sudo[206044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlegnjzhllyjlihbqnmxgnphzevubpfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349134.5678854-1074-152802034167016/AnsiballZ_copy.py'
Feb 17 17:25:35 compute-0 sudo[206044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:35 compute-0 python3.9[206047]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349134.5678854-1074-152802034167016/.source.yaml _original_basename=.0bw2hhdx follow=False checksum=0de68db4ab1742a301935c0684913dc76cf9b759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:35 compute-0 sudo[206044]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:35 compute-0 sudo[206197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njacxnigjkicxbnqovskuzdndnyyaavh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349135.5395176-1089-238411048515522/AnsiballZ_find.py'
Feb 17 17:25:35 compute-0 sudo[206197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:36 compute-0 python3.9[206200]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 17 17:25:36 compute-0 sudo[206197]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:36 compute-0 sudo[206350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyventnyqhamdxklzzehikqrfciqgwfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349136.460776-1099-247866290787160/AnsiballZ_podman_container_info.py'
Feb 17 17:25:36 compute-0 sudo[206350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:37 compute-0 python3.9[206353]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 17 17:25:37 compute-0 sudo[206350]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:37 compute-0 sudo[206516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozvfhuzcmlfdyahmvmcjwxzejkncshzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349137.2358146-1107-59299239478029/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:37 compute-0 sudo[206516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:37 compute-0 python3.9[206519]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:37 compute-0 systemd[1]: Started libpod-conmon-96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75.scope.
Feb 17 17:25:37 compute-0 podman[206520]: 2026-02-17 17:25:37.864075035 +0000 UTC m=+0.072155410 container exec 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 17 17:25:37 compute-0 podman[206520]: 2026-02-17 17:25:37.903546346 +0000 UTC m=+0.111626631 container exec_died 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 17 17:25:37 compute-0 systemd[1]: libpod-conmon-96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75.scope: Deactivated successfully.
Feb 17 17:25:37 compute-0 sudo[206516]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:38 compute-0 sudo[206701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaypatxlfbaveygaprjjugtssrnostwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349138.1001177-1115-120363118473019/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:38 compute-0 sudo[206701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:38 compute-0 python3.9[206704]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:38 compute-0 systemd[1]: Started libpod-conmon-96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75.scope.
Feb 17 17:25:38 compute-0 podman[206705]: 2026-02-17 17:25:38.678421373 +0000 UTC m=+0.079866175 container exec 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:25:38 compute-0 podman[206705]: 2026-02-17 17:25:38.712464474 +0000 UTC m=+0.113909176 container exec_died 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:25:38 compute-0 systemd[1]: libpod-conmon-96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75.scope: Deactivated successfully.
Feb 17 17:25:38 compute-0 sudo[206701]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:39 compute-0 sudo[206884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwjnmvjuzkfukzedvindkcnwkasdvlvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349138.9070911-1123-231864827837107/AnsiballZ_file.py'
Feb 17 17:25:39 compute-0 sudo[206884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:39 compute-0 python3.9[206887]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:39 compute-0 sudo[206884]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:39 compute-0 sudo[207037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnijdfznqjzdnligshwmvfxqwvwomxnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349139.4844067-1132-139084729371082/AnsiballZ_podman_container_info.py'
Feb 17 17:25:39 compute-0 sudo[207037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:39 compute-0 python3.9[207040]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 17 17:25:39 compute-0 sudo[207037]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:40 compute-0 sudo[207203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhpenknkeusfuirhxuwgokfkjbicmgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349140.1559825-1140-241285782866172/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:40 compute-0 sudo[207203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:40 compute-0 python3.9[207206]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:40 compute-0 systemd[1]: Started libpod-conmon-2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998.scope.
Feb 17 17:25:40 compute-0 podman[207207]: 2026-02-17 17:25:40.702896729 +0000 UTC m=+0.068077692 container exec 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 17 17:25:40 compute-0 podman[207207]: 2026-02-17 17:25:40.73327676 +0000 UTC m=+0.098457733 container exec_died 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 17 17:25:40 compute-0 systemd[1]: libpod-conmon-2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998.scope: Deactivated successfully.
Feb 17 17:25:40 compute-0 sudo[207203]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:41 compute-0 sudo[207388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twblvbjlekkmzhgsslgnvyondqoveqoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349140.9146798-1148-217636764122683/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:41 compute-0 sudo[207388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:41 compute-0 python3.9[207391]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:41 compute-0 systemd[1]: Started libpod-conmon-2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998.scope.
Feb 17 17:25:41 compute-0 podman[207392]: 2026-02-17 17:25:41.465502969 +0000 UTC m=+0.066733059 container exec 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 17 17:25:41 compute-0 podman[207411]: 2026-02-17 17:25:41.525285591 +0000 UTC m=+0.051143605 container exec_died 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:25:41 compute-0 podman[207392]: 2026-02-17 17:25:41.53062925 +0000 UTC m=+0.131859330 container exec_died 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 17 17:25:41 compute-0 systemd[1]: libpod-conmon-2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998.scope: Deactivated successfully.
Feb 17 17:25:41 compute-0 sudo[207388]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:41 compute-0 sudo[207573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfnotnduzkcgiqgwlqgsdmmoelkjzrwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349141.6860461-1156-73517380847428/AnsiballZ_file.py'
Feb 17 17:25:41 compute-0 sudo[207573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:42 compute-0 python3.9[207576]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:42 compute-0 sudo[207573]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:42 compute-0 sudo[207726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdtekryaqvdekivkhdturgygvzcukuez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349142.3143396-1165-259071852251291/AnsiballZ_podman_container_info.py'
Feb 17 17:25:42 compute-0 sudo[207726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:42 compute-0 python3.9[207729]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 17 17:25:42 compute-0 sudo[207726]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:43 compute-0 sudo[207904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubivxuqmqbhyxeqgqeihlvoanqgcfnnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349142.9404633-1173-124623758110334/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:43 compute-0 sudo[207904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:43 compute-0 podman[207866]: 2026-02-17 17:25:43.189661336 +0000 UTC m=+0.057411744 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 17 17:25:43 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Main process exited, code=exited, status=1/FAILURE
Feb 17 17:25:43 compute-0 systemd[1]: f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1-47479e77ee6cda40.service: Failed with result 'exit-code'.
Feb 17 17:25:43 compute-0 python3.9[207912]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:43 compute-0 systemd[1]: Started libpod-conmon-f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1.scope.
Feb 17 17:25:43 compute-0 podman[207915]: 2026-02-17 17:25:43.429046616 +0000 UTC m=+0.076904344 container exec f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 17 17:25:43 compute-0 podman[207915]: 2026-02-17 17:25:43.469516122 +0000 UTC m=+0.117373840 container exec_died f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 17 17:25:43 compute-0 systemd[1]: libpod-conmon-f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1.scope: Deactivated successfully.
Feb 17 17:25:43 compute-0 sudo[207904]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:43 compute-0 sudo[208111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyazmeqyzeaozeazuiplaxzlhwpjmpdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349143.6413383-1181-212295625423432/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:43 compute-0 sudo[208111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:43 compute-0 podman[208071]: 2026-02-17 17:25:43.888702785 +0000 UTC m=+0.048358037 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:25:44 compute-0 python3.9[208120]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:44 compute-0 systemd[1]: Started libpod-conmon-f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1.scope.
Feb 17 17:25:44 compute-0 podman[208121]: 2026-02-17 17:25:44.164891472 +0000 UTC m=+0.085120963 container exec f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127)
Feb 17 17:25:44 compute-0 podman[208121]: 2026-02-17 17:25:44.193987873 +0000 UTC m=+0.114217364 container exec_died f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 17 17:25:44 compute-0 systemd[1]: libpod-conmon-f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1.scope: Deactivated successfully.
Feb 17 17:25:44 compute-0 sudo[208111]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:44 compute-0 sudo[208303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsjewquwwxkzxmonurxjslpjgxfbmiza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349144.366677-1189-189665705789105/AnsiballZ_file.py'
Feb 17 17:25:44 compute-0 sudo[208303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:44 compute-0 python3.9[208306]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:44 compute-0 sudo[208303]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:45 compute-0 sudo[208456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvysawquijarbijgmjovdekctvolwbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349144.9750504-1198-230279582065947/AnsiballZ_podman_container_info.py'
Feb 17 17:25:45 compute-0 sudo[208456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:45 compute-0 python3.9[208459]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 17 17:25:45 compute-0 sudo[208456]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:45 compute-0 sudo[208622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhyvwnlsfkkknzphmvcnedmlnfqblvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349145.5829177-1206-142592076360203/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:45 compute-0 sudo[208622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:45 compute-0 python3.9[208625]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:46 compute-0 systemd[1]: Started libpod-conmon-5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12.scope.
Feb 17 17:25:46 compute-0 podman[208626]: 2026-02-17 17:25:46.076950248 +0000 UTC m=+0.065210604 container exec 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:25:46 compute-0 podman[208646]: 2026-02-17 17:25:46.140300924 +0000 UTC m=+0.053918750 container exec_died 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:25:46 compute-0 podman[208626]: 2026-02-17 17:25:46.147112219 +0000 UTC m=+0.135372565 container exec_died 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:25:46 compute-0 systemd[1]: libpod-conmon-5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12.scope: Deactivated successfully.
Feb 17 17:25:46 compute-0 sudo[208622]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:46 compute-0 sudo[208819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwbkxglhborndiifhqpfxeovdjiseeos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349146.3164027-1214-166109016765788/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:46 compute-0 sudo[208819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:46 compute-0 podman[208782]: 2026-02-17 17:25:46.622720982 +0000 UTC m=+0.073919903 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:25:46 compute-0 python3.9[208822]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:46 compute-0 systemd[1]: Started libpod-conmon-5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12.scope.
Feb 17 17:25:46 compute-0 podman[208835]: 2026-02-17 17:25:46.916334158 +0000 UTC m=+0.089587970 container exec 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:25:46 compute-0 podman[208835]: 2026-02-17 17:25:46.947524521 +0000 UTC m=+0.120778323 container exec_died 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:25:46 compute-0 systemd[1]: libpod-conmon-5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12.scope: Deactivated successfully.
Feb 17 17:25:46 compute-0 sudo[208819]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:47 compute-0 sudo[209017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrsvvrgjetfohzlzqdstcpnlyeedpjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349147.1747472-1222-240442832442649/AnsiballZ_file.py'
Feb 17 17:25:47 compute-0 sudo[209017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:47 compute-0 python3.9[209020]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:47 compute-0 sudo[209017]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:48 compute-0 sudo[209170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heehxmnfvmiuzujxzxgijxmukoxeuuqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349147.827447-1231-128486013700352/AnsiballZ_podman_container_info.py'
Feb 17 17:25:48 compute-0 sudo[209170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:48 compute-0 python3.9[209173]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 17 17:25:48 compute-0 sudo[209170]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:48 compute-0 sudo[209336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymaykbhvezwwojyuljouzdhkyamnbsbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349148.44259-1239-22710612802646/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:48 compute-0 sudo[209336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:49 compute-0 python3.9[209339]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:49 compute-0 systemd[1]: Started libpod-conmon-9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6.scope.
Feb 17 17:25:49 compute-0 podman[209340]: 2026-02-17 17:25:49.126580282 +0000 UTC m=+0.085991865 container exec 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:25:49 compute-0 podman[209340]: 2026-02-17 17:25:49.156115733 +0000 UTC m=+0.115527286 container exec_died 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:25:49 compute-0 systemd[1]: libpod-conmon-9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6.scope: Deactivated successfully.
Feb 17 17:25:49 compute-0 sudo[209336]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:49 compute-0 sudo[209522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjyngbbyyfqqqvrrybbjgmdxjxhwyuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349149.3874848-1247-86275582858144/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:49 compute-0 sudo[209522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:49 compute-0 python3.9[209525]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:49 compute-0 systemd[1]: Started libpod-conmon-9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6.scope.
Feb 17 17:25:49 compute-0 podman[209526]: 2026-02-17 17:25:49.891102159 +0000 UTC m=+0.071849193 container exec 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:25:49 compute-0 podman[209526]: 2026-02-17 17:25:49.921578123 +0000 UTC m=+0.102325117 container exec_died 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:25:49 compute-0 systemd[1]: libpod-conmon-9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6.scope: Deactivated successfully.
Feb 17 17:25:49 compute-0 sudo[209522]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:50 compute-0 sudo[209705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phgjlynhjbmchlruepqlpnqvqchwmtos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349150.099579-1255-79366666278439/AnsiballZ_file.py'
Feb 17 17:25:50 compute-0 sudo[209705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:50 compute-0 python3.9[209708]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:50 compute-0 sudo[209705]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:50 compute-0 sudo[209858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omivfmvhoitvmymdqrvhficmvveiedzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349150.731487-1264-47262283927881/AnsiballZ_podman_container_info.py'
Feb 17 17:25:50 compute-0 sudo[209858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:51 compute-0 python3.9[209861]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 17 17:25:51 compute-0 sudo[209858]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:51 compute-0 sudo[210024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrirlqkzvrqppqfpbyfcqycpowbshijj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349151.407307-1272-153157084044917/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:51 compute-0 sudo[210024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:51 compute-0 python3.9[210027]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:51 compute-0 systemd[1]: Started libpod-conmon-932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a.scope.
Feb 17 17:25:51 compute-0 podman[210028]: 2026-02-17 17:25:51.914858586 +0000 UTC m=+0.062853346 container exec 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, release=1770267347, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 17 17:25:51 compute-0 podman[210028]: 2026-02-17 17:25:51.944407798 +0000 UTC m=+0.092402598 container exec_died 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Feb 17 17:25:51 compute-0 systemd[1]: libpod-conmon-932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a.scope: Deactivated successfully.
Feb 17 17:25:51 compute-0 sudo[210024]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:52 compute-0 sudo[210210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqkxlvheefkkzoxionzafmvadresezra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349152.1134079-1280-127929986461519/AnsiballZ_podman_container_exec.py'
Feb 17 17:25:52 compute-0 sudo[210210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:52 compute-0 python3.9[210213]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 17 17:25:52 compute-0 systemd[1]: Started libpod-conmon-932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a.scope.
Feb 17 17:25:52 compute-0 podman[210214]: 2026-02-17 17:25:52.636006768 +0000 UTC m=+0.077513429 container exec 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 17 17:25:52 compute-0 podman[210214]: 2026-02-17 17:25:52.667020125 +0000 UTC m=+0.108526726 container exec_died 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Feb 17 17:25:52 compute-0 systemd[1]: libpod-conmon-932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a.scope: Deactivated successfully.
Feb 17 17:25:52 compute-0 sudo[210210]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:53 compute-0 sudo[210397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjuscftikvgtvirafwuijzsitghcihb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349152.8766446-1288-273329169982737/AnsiballZ_file.py'
Feb 17 17:25:53 compute-0 sudo[210397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:53 compute-0 python3.9[210400]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:53 compute-0 sudo[210397]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:53 compute-0 nova_compute[186479]: 2026-02-17 17:25:53.623 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:53 compute-0 nova_compute[186479]: 2026-02-17 17:25:53.645 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:53 compute-0 nova_compute[186479]: 2026-02-17 17:25:53.645 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:53 compute-0 sudo[210550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnufiecltcgpyctdzsvpooqaxoxpyqwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349153.5305855-1297-132357976000616/AnsiballZ_file.py'
Feb 17 17:25:53 compute-0 sudo[210550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:53 compute-0 python3.9[210553]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:54 compute-0 sudo[210550]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.326 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:25:54 compute-0 sudo[210703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czkwezpmqbgndrrlrugnllqwlhenkrxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349154.148123-1305-262067599124874/AnsiballZ_stat.py'
Feb 17 17:25:54 compute-0 sudo[210703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.467 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.468 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5863MB free_disk=73.20986938476562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.468 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.468 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.518 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.519 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.538 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:25:54 compute-0 python3.9[210706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.564 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.567 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:25:54 compute-0 nova_compute[186479]: 2026-02-17 17:25:54.567 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:25:54 compute-0 sudo[210703]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:54 compute-0 sudo[210827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckhkakmgyggrwkoeigldxeozgxszjxja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349154.148123-1305-262067599124874/AnsiballZ_copy.py'
Feb 17 17:25:54 compute-0 sudo[210827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:55 compute-0 python3.9[210830]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771349154.148123-1305-262067599124874/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:55 compute-0 sudo[210827]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:55 compute-0 sudo[210980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmrhmyplekfdycevtzeenlchsdsnmnaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349155.3069527-1321-261198948775789/AnsiballZ_file.py'
Feb 17 17:25:55 compute-0 sudo[210980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.567 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.567 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.567 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.587 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.588 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.588 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:25:55 compute-0 nova_compute[186479]: 2026-02-17 17:25:55.588 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:25:55 compute-0 python3.9[210983]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:55 compute-0 sudo[210980]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:56 compute-0 sudo[211143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alihiulwdifxmnuffxkwhhvmxcenotou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349155.8575983-1329-173497941552577/AnsiballZ_stat.py'
Feb 17 17:25:56 compute-0 sudo[211143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:56 compute-0 podman[211107]: 2026-02-17 17:25:56.144693647 +0000 UTC m=+0.090402780 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Feb 17 17:25:56 compute-0 python3.9[211150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:56 compute-0 sudo[211143]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:56 compute-0 sudo[211239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mchgppiasqnnmurzjljwfozrbykoahei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349155.8575983-1329-173497941552577/AnsiballZ_file.py'
Feb 17 17:25:56 compute-0 sudo[211239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:56 compute-0 python3.9[211242]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:56 compute-0 sudo[211239]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:57 compute-0 sudo[211392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyuqklejbuanxzybvqbpxtqlmkjlebgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349156.8631856-1341-210221057890715/AnsiballZ_stat.py'
Feb 17 17:25:57 compute-0 sudo[211392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:57 compute-0 python3.9[211395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:57 compute-0 sudo[211392]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:57 compute-0 sudo[211471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hohnfoxrfsqivbkorisptyfrndackros ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349156.8631856-1341-210221057890715/AnsiballZ_file.py'
Feb 17 17:25:57 compute-0 sudo[211471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:57 compute-0 python3.9[211474]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7rufhavq recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:57 compute-0 sudo[211471]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:58 compute-0 sudo[211639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybyzmqkofadlqdsfyamhckgynkxtaaik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349157.8618205-1353-44066119714739/AnsiballZ_stat.py'
Feb 17 17:25:58 compute-0 sudo[211639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:58 compute-0 podman[211600]: 2026-02-17 17:25:58.129524497 +0000 UTC m=+0.067998731 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:25:58 compute-0 sshd-session[211499]: Invalid user admin from 209.38.233.161 port 57228
Feb 17 17:25:58 compute-0 sshd-session[211499]: Connection closed by invalid user admin 209.38.233.161 port 57228 [preauth]
Feb 17 17:25:58 compute-0 python3.9[211653]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:25:58 compute-0 sudo[211639]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:58 compute-0 sudo[211729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddxkgajeamxknpgtjxfbakbnolvhqato ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349157.8618205-1353-44066119714739/AnsiballZ_file.py'
Feb 17 17:25:58 compute-0 sudo[211729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:58 compute-0 python3.9[211732]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:25:58 compute-0 sudo[211729]: pam_unix(sudo:session): session closed for user root
Feb 17 17:25:59 compute-0 sudo[211882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohjjdwptjdugrznrwupqowfxqqlbfnbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349158.889443-1366-40671407983909/AnsiballZ_command.py'
Feb 17 17:25:59 compute-0 sudo[211882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:25:59 compute-0 python3.9[211885]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:25:59 compute-0 sudo[211882]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:00 compute-0 sudo[212038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aysbrjylfqezglkoarwhudapxlqxpfxy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771349159.9059505-1374-248486359904704/AnsiballZ_edpm_nftables_from_files.py'
Feb 17 17:26:00 compute-0 sudo[212038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:00 compute-0 python3[212041]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 17 17:26:00 compute-0 sudo[212038]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:01 compute-0 sudo[212191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtmcctszztxauzxhggtlxsmfrhikukod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349160.8481486-1382-159860244504760/AnsiballZ_stat.py'
Feb 17 17:26:01 compute-0 sudo[212191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:01 compute-0 python3.9[212194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:26:01 compute-0 sudo[212191]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:01 compute-0 sudo[212270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjhotgsxwquyebakbbwdmrusgnxvpzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349160.8481486-1382-159860244504760/AnsiballZ_file.py'
Feb 17 17:26:01 compute-0 sudo[212270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:01 compute-0 python3.9[212273]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:01 compute-0 sudo[212270]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:02 compute-0 sudo[212423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwipwezvpjouodlikrklhphgxvhrkxsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349161.8395708-1394-35848239443569/AnsiballZ_stat.py'
Feb 17 17:26:02 compute-0 sudo[212423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:02 compute-0 python3.9[212426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:26:02 compute-0 sudo[212423]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:02 compute-0 sudo[212502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jednzktkworpldgmnnmiaigxhcxliyti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349161.8395708-1394-35848239443569/AnsiballZ_file.py'
Feb 17 17:26:02 compute-0 sudo[212502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:02 compute-0 python3.9[212505]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:02 compute-0 sudo[212502]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:03 compute-0 sudo[212655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrmrdgrxodprpczdtlxesswdkchrnfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349162.8716688-1406-55588665347654/AnsiballZ_stat.py'
Feb 17 17:26:03 compute-0 sudo[212655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:03 compute-0 python3.9[212658]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:26:03 compute-0 sudo[212655]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:03 compute-0 sudo[212734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpwjaemuppkdyswbenxroqczipuvvawm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349162.8716688-1406-55588665347654/AnsiballZ_file.py'
Feb 17 17:26:03 compute-0 sudo[212734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:03 compute-0 podman[212736]: 2026-02-17 17:26:03.628932736 +0000 UTC m=+0.061075054 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-02-05T04:57:10Z)
Feb 17 17:26:03 compute-0 python3.9[212738]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:03 compute-0 sudo[212734]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:04 compute-0 sudo[212908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aajkuevlsooxxcussjymdykdqxpqwtmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349163.9251952-1418-157674884508068/AnsiballZ_stat.py'
Feb 17 17:26:04 compute-0 sudo[212908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:04 compute-0 python3.9[212911]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:26:04 compute-0 sudo[212908]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:04 compute-0 sudo[212987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azelvykcgrruxlvupqewnrppfqimmrcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349163.9251952-1418-157674884508068/AnsiballZ_file.py'
Feb 17 17:26:04 compute-0 sudo[212987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:04 compute-0 python3.9[212990]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:04 compute-0 sudo[212987]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:05 compute-0 sudo[213140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jonvrdpgdavcsnudnveyevtfoelnucrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349164.9271207-1430-230493101282577/AnsiballZ_stat.py'
Feb 17 17:26:05 compute-0 sudo[213140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:05 compute-0 python3.9[213143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 17 17:26:05 compute-0 sudo[213140]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:05 compute-0 sudo[213266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpfqdudpbpvxfcfphcdedssrrfrelwis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349164.9271207-1430-230493101282577/AnsiballZ_copy.py'
Feb 17 17:26:05 compute-0 sudo[213266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:05 compute-0 python3.9[213269]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771349164.9271207-1430-230493101282577/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:06 compute-0 sudo[213266]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:06 compute-0 sudo[213419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtpfqglfetkucqhyddhrdnlofhrplake ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349166.17111-1445-61309582187485/AnsiballZ_file.py'
Feb 17 17:26:06 compute-0 sudo[213419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:06 compute-0 python3.9[213422]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:06 compute-0 sudo[213419]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:07 compute-0 sudo[213572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfoybhgiascunnrmlybmkhtzpgdaffi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349166.8735018-1453-267391711352523/AnsiballZ_command.py'
Feb 17 17:26:07 compute-0 sudo[213572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:07 compute-0 python3.9[213575]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:26:07 compute-0 sudo[213572]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:07 compute-0 sudo[213728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jguahdnmtiboslimmgydfokxpynhidlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349167.513398-1461-253873636568642/AnsiballZ_blockinfile.py'
Feb 17 17:26:07 compute-0 sudo[213728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:08 compute-0 python3.9[213731]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:08 compute-0 sudo[213728]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:08 compute-0 sudo[213881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhnbhnwdfrusefrkzpjmpjzkqjbgzmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349168.3474395-1470-190337279298533/AnsiballZ_command.py'
Feb 17 17:26:08 compute-0 sudo[213881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:08 compute-0 python3.9[213884]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:26:08 compute-0 sudo[213881]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:09 compute-0 sudo[214035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytufcllnedownppjyjytjdlbzudasca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349169.0000224-1478-206054596357612/AnsiballZ_stat.py'
Feb 17 17:26:09 compute-0 sudo[214035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:09 compute-0 python3.9[214038]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 17 17:26:09 compute-0 sudo[214035]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:09 compute-0 sudo[214190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftmqygebyluoubzveemgrxslcawjagvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349169.5787654-1486-238864515729035/AnsiballZ_command.py'
Feb 17 17:26:09 compute-0 sudo[214190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:09 compute-0 python3.9[214193]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 17 17:26:10 compute-0 sudo[214190]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:10 compute-0 sudo[214346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymzretehuqciiutcgzknaxxwqfobclog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771349170.195536-1494-78666933888991/AnsiballZ_file.py'
Feb 17 17:26:10 compute-0 sudo[214346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:26:10 compute-0 python3.9[214349]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 17 17:26:10 compute-0 sudo[214346]: pam_unix(sudo:session): session closed for user root
Feb 17 17:26:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:10.942 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:26:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:10.943 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:26:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:10.944 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:26:10 compute-0 sshd-session[186808]: Connection closed by 192.168.122.30 port 59420
Feb 17 17:26:10 compute-0 sshd-session[186805]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:26:10 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 17 17:26:11 compute-0 systemd[1]: session-25.scope: Consumed 1min 31.399s CPU time.
Feb 17 17:26:11 compute-0 systemd-logind[806]: Session 25 logged out. Waiting for processes to exit.
Feb 17 17:26:11 compute-0 systemd-logind[806]: Removed session 25.
Feb 17 17:26:13 compute-0 podman[214374]: 2026-02-17 17:26:13.72994853 +0000 UTC m=+0.061265715 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 17 17:26:14 compute-0 podman[214394]: 2026-02-17 17:26:14.725359426 +0000 UTC m=+0.059882831 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 17 17:26:16 compute-0 podman[214414]: 2026-02-17 17:26:16.720088545 +0000 UTC m=+0.066632207 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:26:26 compute-0 podman[214438]: 2026-02-17 17:26:26.744310505 +0000 UTC m=+0.092281396 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 17 17:26:28 compute-0 podman[214464]: 2026-02-17 17:26:28.720776285 +0000 UTC m=+0.061414518 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:26:34 compute-0 podman[214487]: 2026-02-17 17:26:34.701930475 +0000 UTC m=+0.048386478 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:26:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:26:44 compute-0 podman[214508]: 2026-02-17 17:26:44.723207396 +0000 UTC m=+0.066755470 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 17 17:26:44 compute-0 podman[214528]: 2026-02-17 17:26:44.802952863 +0000 UTC m=+0.051595148 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 17 17:26:47 compute-0 podman[214549]: 2026-02-17 17:26:47.716353782 +0000 UTC m=+0.060381263 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:26:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:51.391 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:26:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:51.392 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:26:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:26:51.393 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:26:54 compute-0 nova_compute[186479]: 2026-02-17 17:26:54.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:55 compute-0 nova_compute[186479]: 2026-02-17 17:26:55.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:55 compute-0 nova_compute[186479]: 2026-02-17 17:26:55.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:55 compute-0 nova_compute[186479]: 2026-02-17 17:26:55.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:55 compute-0 nova_compute[186479]: 2026-02-17 17:26:55.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:55 compute-0 nova_compute[186479]: 2026-02-17 17:26:55.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.318 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.318 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.318 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.318 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.344 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.345 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.345 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.346 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.482 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.483 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6004MB free_disk=73.24373245239258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.483 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.484 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.539 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.540 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.563 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.576 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.577 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:26:56 compute-0 nova_compute[186479]: 2026-02-17 17:26:56.578 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:26:57 compute-0 podman[214573]: 2026-02-17 17:26:57.760131656 +0000 UTC m=+0.101676997 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:26:59 compute-0 podman[214599]: 2026-02-17 17:26:59.706946907 +0000 UTC m=+0.051256428 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:27:05 compute-0 podman[214623]: 2026-02-17 17:27:05.73019004 +0000 UTC m=+0.077834620 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public)
Feb 17 17:27:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:27:10.943 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:27:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:27:10.943 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:27:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:27:10.944 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:27:14 compute-0 sshd-session[214644]: Invalid user admin from 209.38.233.161 port 59966
Feb 17 17:27:15 compute-0 podman[214646]: 2026-02-17 17:27:15.047933442 +0000 UTC m=+0.049671618 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 17 17:27:15 compute-0 podman[214647]: 2026-02-17 17:27:15.053223381 +0000 UTC m=+0.052009910 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 17 17:27:15 compute-0 sshd-session[214644]: Connection closed by invalid user admin 209.38.233.161 port 59966 [preauth]
Feb 17 17:27:18 compute-0 podman[214684]: 2026-02-17 17:27:18.714830136 +0000 UTC m=+0.052998495 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:27:28 compute-0 podman[214708]: 2026-02-17 17:27:28.727144389 +0000 UTC m=+0.072611078 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:27:30 compute-0 podman[214735]: 2026-02-17 17:27:30.696563589 +0000 UTC m=+0.041403044 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:27:36 compute-0 podman[214760]: 2026-02-17 17:27:36.694747675 +0000 UTC m=+0.042797899 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, version=9.7, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.)
Feb 17 17:27:45 compute-0 podman[214782]: 2026-02-17 17:27:45.708601993 +0000 UTC m=+0.050881479 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 17 17:27:45 compute-0 podman[214783]: 2026-02-17 17:27:45.734778618 +0000 UTC m=+0.075301749 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 17 17:27:49 compute-0 podman[214822]: 2026-02-17 17:27:49.696850353 +0000 UTC m=+0.045428799 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:27:55 compute-0 nova_compute[186479]: 2026-02-17 17:27:55.562 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:55 compute-0 nova_compute[186479]: 2026-02-17 17:27:55.563 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:55 compute-0 nova_compute[186479]: 2026-02-17 17:27:55.563 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:27:56 compute-0 nova_compute[186479]: 2026-02-17 17:27:56.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:56 compute-0 nova_compute[186479]: 2026-02-17 17:27:56.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:56 compute-0 nova_compute[186479]: 2026-02-17 17:27:56.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.299 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.336 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.336 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.373 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.373 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.374 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.374 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.504 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.505 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6063MB free_disk=73.24375915527344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.505 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.505 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.573 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.574 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.614 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.631 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.634 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:27:57 compute-0 nova_compute[186479]: 2026-02-17 17:27:57.635 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:27:58 compute-0 nova_compute[186479]: 2026-02-17 17:27:58.602 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:58 compute-0 nova_compute[186479]: 2026-02-17 17:27:58.616 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:27:59 compute-0 podman[214847]: 2026-02-17 17:27:59.741832288 +0000 UTC m=+0.082789033 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 17 17:28:01 compute-0 podman[214873]: 2026-02-17 17:28:01.721187747 +0000 UTC m=+0.068977383 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:28:07 compute-0 podman[214897]: 2026-02-17 17:28:07.703690154 +0000 UTC m=+0.049754810 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 17 17:28:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:10.944 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:10.944 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:10.945 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:16 compute-0 podman[214918]: 2026-02-17 17:28:16.711106916 +0000 UTC m=+0.052872283 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 17 17:28:16 compute-0 podman[214919]: 2026-02-17 17:28:16.727309958 +0000 UTC m=+0.063990322 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:28:20 compute-0 podman[214955]: 2026-02-17 17:28:20.708006251 +0000 UTC m=+0.053723507 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:28:28 compute-0 sshd-session[214979]: Invalid user admin from 209.38.233.161 port 37064
Feb 17 17:28:28 compute-0 sshd-session[214979]: Connection closed by invalid user admin 209.38.233.161 port 37064 [preauth]
Feb 17 17:28:30 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:30.320 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:28:30 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:30.322 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:28:30 compute-0 podman[214981]: 2026-02-17 17:28:30.409699961 +0000 UTC m=+0.066535039 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:28:32 compute-0 podman[215008]: 2026-02-17 17:28:32.716241498 +0000 UTC m=+0.049458306 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:28:33 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:33.325 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:28:38 compute-0 podman[215032]: 2026-02-17 17:28:38.735939716 +0000 UTC m=+0.080360639 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:28:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:28:47 compute-0 podman[215054]: 2026-02-17 17:28:47.732752413 +0000 UTC m=+0.070501742 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 17 17:28:47 compute-0 podman[215053]: 2026-02-17 17:28:47.732829405 +0000 UTC m=+0.077392200 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.342 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.343 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.365 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.480 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.481 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.489 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.490 186483 INFO nova.compute.claims [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.597 186483 DEBUG nova.compute.provider_tree [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.612 186483 DEBUG nova.scheduler.client.report [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.637 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.638 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.687 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.688 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.713 186483 INFO nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.735 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.827 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.829 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.829 186483 INFO nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Creating image(s)
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.830 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.831 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.832 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.832 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:48 compute-0 nova_compute[186479]: 2026-02-17 17:28:48.833 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:49 compute-0 nova_compute[186479]: 2026-02-17 17:28:49.256 186483 WARNING oslo_policy.policy [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 17 17:28:49 compute-0 nova_compute[186479]: 2026-02-17 17:28:49.257 186483 WARNING oslo_policy.policy [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 17 17:28:49 compute-0 nova_compute[186479]: 2026-02-17 17:28:49.260 186483 DEBUG nova.policy [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.232 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.288 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.part --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.290 186483 DEBUG nova.virt.images [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] 4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.293 186483 DEBUG nova.privsep.utils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.293 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.part /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.453 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.part /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.converted" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.456 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.508 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2.converted --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.509 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:50 compute-0 nova_compute[186479]: 2026-02-17 17:28:50.523 186483 INFO oslo.privsep.daemon [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmps60n2krr/privsep.sock']
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.186 186483 INFO oslo.privsep.daemon [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Spawned new privsep daemon via rootwrap
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.034 215112 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.038 215112 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.040 215112 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.040 215112 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215112
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.267 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.324 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Successfully created port: 32811792-c388-4fbf-ae6d-43c1e4b213bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.328 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.329 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.329 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.342 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.402 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.408 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.440 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.441 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.442 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.491 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.492 186483 DEBUG nova.virt.disk.api [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.493 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.564 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.565 186483 DEBUG nova.virt.disk.api [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.566 186483 DEBUG nova.objects.instance [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.581 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.581 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Ensure instance console log exists: /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.581 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.582 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:51 compute-0 nova_compute[186479]: 2026-02-17 17:28:51.582 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:51 compute-0 podman[215129]: 2026-02-17 17:28:51.705096796 +0000 UTC m=+0.042442083 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.268 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Successfully updated port: 32811792-c388-4fbf-ae6d-43c1e4b213bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.286 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.286 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.287 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.322 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.324 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.324 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.337 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.459 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.808 186483 DEBUG nova.compute.manager [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.809 186483 DEBUG nova.compute.manager [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing instance network info cache due to event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:28:53 compute-0 nova_compute[186479]: 2026-02-17 17:28:53.809 186483 DEBUG oslo_concurrency.lockutils [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.741 186483 DEBUG nova.network.neutron [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.760 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.760 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance network_info: |[{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.761 186483 DEBUG oslo_concurrency.lockutils [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.761 186483 DEBUG nova.network.neutron [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.765 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Start _get_guest_xml network_info=[{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.771 186483 WARNING nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.775 186483 DEBUG nova.virt.libvirt.host [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.776 186483 DEBUG nova.virt.libvirt.host [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.785 186483 DEBUG nova.virt.libvirt.host [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.785 186483 DEBUG nova.virt.libvirt.host [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.786 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.786 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.787 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.787 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.787 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.787 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.787 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.788 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.788 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.788 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.788 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.788 186483 DEBUG nova.virt.hardware [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.804 186483 DEBUG nova.privsep.utils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.805 186483 DEBUG nova.virt.libvirt.vif [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:28:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-572755466',display_name='tempest-TestNetworkBasicOps-server-572755466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-572755466',id=1,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDujMpTWlrR3Ei876npo995e/qq2Qz02Rp1wCi/x8Ta1eJB1HmxKd48/OhgRcLr6r+tp+DDPpmJ2LCpL/r9Wcf04pbpqIidfhknfTDr1PIqBOZQW02EVQTDBpRV4oH7OBA==',key_name='tempest-TestNetworkBasicOps-599335326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-birwp627',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:28:48Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.805 186483 DEBUG nova.network.os_vif_util [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.806 186483 DEBUG nova.network.os_vif_util [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.808 186483 DEBUG nova.objects.instance [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.901 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <uuid>8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa</uuid>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <name>instance-00000001</name>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-572755466</nova:name>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:28:54</nova:creationTime>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         <nova:port uuid="32811792-c388-4fbf-ae6d-43c1e4b213bf">
Feb 17 17:28:54 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <system>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="serial">8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="uuid">8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </system>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <os>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.config"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:1f:63:df"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <target dev="tap32811792-c3"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/console.log" append="off"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <video>
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:28:54 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:28:54 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:28:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:28:54 compute-0 nova_compute[186479]: </domain>
Feb 17 17:28:54 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.903 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Preparing to wait for external event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.903 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.904 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.904 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.905 186483 DEBUG nova.virt.libvirt.vif [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:28:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-572755466',display_name='tempest-TestNetworkBasicOps-server-572755466',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-572755466',id=1,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDujMpTWlrR3Ei876npo995e/qq2Qz02Rp1wCi/x8Ta1eJB1HmxKd48/OhgRcLr6r+tp+DDPpmJ2LCpL/r9Wcf04pbpqIidfhknfTDr1PIqBOZQW02EVQTDBpRV4oH7OBA==',key_name='tempest-TestNetworkBasicOps-599335326',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-birwp627',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:28:48Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.906 186483 DEBUG nova.network.os_vif_util [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.907 186483 DEBUG nova.network.os_vif_util [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.907 186483 DEBUG os_vif [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.964 186483 DEBUG ovsdbapp.backend.ovs_idl [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.964 186483 DEBUG ovsdbapp.backend.ovs_idl [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.965 186483 DEBUG ovsdbapp.backend.ovs_idl [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.966 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.967 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.967 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.968 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.970 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.991 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.991 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.992 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:28:54 compute-0 nova_compute[186479]: 2026-02-17 17:28:54.993 186483 INFO oslo.privsep.daemon [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpemn4rj1p/privsep.sock']
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.626 186483 INFO oslo.privsep.daemon [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Spawned new privsep daemon via rootwrap
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.468 215158 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.473 215158 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.474 215158 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.475 215158 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215158
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.920 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.920 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32811792-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.921 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap32811792-c3, col_values=(('external_ids', {'iface-id': '32811792-c388-4fbf-ae6d-43c1e4b213bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:63:df', 'vm-uuid': '8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.922 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:55 compute-0 NetworkManager[56323]: <info>  [1771349335.9236] manager: (tap32811792-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.925 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.928 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.930 186483 INFO os_vif [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3')
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.991 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.992 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.992 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:1f:63:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:28:55 compute-0 nova_compute[186479]: 2026-02-17 17:28:55.993 186483 INFO nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Using config drive
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.276 186483 DEBUG nova.network.neutron [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updated VIF entry in instance network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.277 186483 DEBUG nova.network.neutron [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.292 186483 DEBUG oslo_concurrency.lockutils [req-3ca5946b-42e3-4620-b75e-cbeb91ea6d92 req-13493d34-2f38-4b88-8278-bdbb9c641c56 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.348 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.349 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.349 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.349 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.535 186483 INFO nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Creating config drive at /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.config
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.543 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6n48_av8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.665 186483 DEBUG oslo_concurrency.processutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6n48_av8" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:28:56 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 17 17:28:56 compute-0 kernel: tap32811792-c3: entered promiscuous mode
Feb 17 17:28:56 compute-0 NetworkManager[56323]: <info>  [1771349336.7235] manager: (tap32811792-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.724 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:56 compute-0 ovn_controller[96568]: 2026-02-17T17:28:56Z|00027|binding|INFO|Claiming lport 32811792-c388-4fbf-ae6d-43c1e4b213bf for this chassis.
Feb 17 17:28:56 compute-0 ovn_controller[96568]: 2026-02-17T17:28:56Z|00028|binding|INFO|32811792-c388-4fbf-ae6d-43c1e4b213bf: Claiming fa:16:3e:1f:63:df 10.100.0.10
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.727 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:56.739 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:63:df 10.100.0.10'], port_security=['fa:16:3e:1f:63:df 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '05969d3c-be24-4ef7-ba4b-be8a5826cb11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=022c023e-9b10-4231-98a6-e0cad7ebafd6, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=32811792-c388-4fbf-ae6d-43c1e4b213bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:28:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:56.740 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 32811792-c388-4fbf-ae6d-43c1e4b213bf in datapath 669651c7-5ac9-4d2e-a8a1-366ce4dcd584 bound to our chassis
Feb 17 17:28:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:56.742 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 669651c7-5ac9-4d2e-a8a1-366ce4dcd584
Feb 17 17:28:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:56.743 105898 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpdfdsvzp9/privsep.sock']
Feb 17 17:28:56 compute-0 systemd-udevd[215184]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.760 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:56 compute-0 NetworkManager[56323]: <info>  [1771349336.7646] device (tap32811792-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:28:56 compute-0 ovn_controller[96568]: 2026-02-17T17:28:56Z|00029|binding|INFO|Setting lport 32811792-c388-4fbf-ae6d-43c1e4b213bf ovn-installed in OVS
Feb 17 17:28:56 compute-0 ovn_controller[96568]: 2026-02-17T17:28:56Z|00030|binding|INFO|Setting lport 32811792-c388-4fbf-ae6d-43c1e4b213bf up in Southbound
Feb 17 17:28:56 compute-0 NetworkManager[56323]: <info>  [1771349336.7662] device (tap32811792-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:28:56 compute-0 nova_compute[186479]: 2026-02-17 17:28:56.767 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:56 compute-0 systemd-machined[155877]: New machine qemu-1-instance-00000001.
Feb 17 17:28:56 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.003 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349337.0029676, 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.004 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] VM Started (Lifecycle Event)
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.008 186483 DEBUG nova.compute.manager [req-5ed7f975-42f4-4922-b6d8-46b45967d5d7 req-c79b845b-cc09-4265-ab99-3d76dfd7042b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.008 186483 DEBUG oslo_concurrency.lockutils [req-5ed7f975-42f4-4922-b6d8-46b45967d5d7 req-c79b845b-cc09-4265-ab99-3d76dfd7042b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.008 186483 DEBUG oslo_concurrency.lockutils [req-5ed7f975-42f4-4922-b6d8-46b45967d5d7 req-c79b845b-cc09-4265-ab99-3d76dfd7042b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.008 186483 DEBUG oslo_concurrency.lockutils [req-5ed7f975-42f4-4922-b6d8-46b45967d5d7 req-c79b845b-cc09-4265-ab99-3d76dfd7042b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.009 186483 DEBUG nova.compute.manager [req-5ed7f975-42f4-4922-b6d8-46b45967d5d7 req-c79b845b-cc09-4265-ab99-3d76dfd7042b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Processing event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.009 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.020 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.023 186483 INFO nova.virt.libvirt.driver [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance spawned successfully.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.023 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.078 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.080 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.103 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.104 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349337.003129, 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.104 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] VM Paused (Lifecycle Event)
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.136 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.141 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349337.020319, 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.141 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] VM Resumed (Lifecycle Event)
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.145 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.146 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.146 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.147 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.147 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.148 186483 DEBUG nova.virt.libvirt.driver [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.181 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.185 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.203 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.238 186483 INFO nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Took 8.41 seconds to spawn the instance on the hypervisor.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.239 186483 DEBUG nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.328 186483 INFO nova.compute.manager [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Took 8.89 seconds to build instance.
Feb 17 17:28:57 compute-0 nova_compute[186479]: 2026-02-17 17:28:57.344 186483 DEBUG oslo_concurrency.lockutils [None req-d98aeae5-60a4-4fe7-b9b5-9a3c6ac38730 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.479 105898 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.480 105898 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdfdsvzp9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.319 215208 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.324 215208 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.327 215208 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.327 215208 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215208
Feb 17 17:28:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:57.482 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4c53938d-fbc0-40c0-99a4-f2516842325e]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.047 215208 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.048 215208 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.048 215208 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:58 compute-0 nova_compute[186479]: 2026-02-17 17:28:58.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:28:58 compute-0 nova_compute[186479]: 2026-02-17 17:28:58.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:28:58 compute-0 nova_compute[186479]: 2026-02-17 17:28:58.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:28:58 compute-0 nova_compute[186479]: 2026-02-17 17:28:58.340 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.634 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0c49fe-1a62-4095-9e3e-feb7c773050f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.635 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap669651c7-51 in ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.637 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap669651c7-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.637 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec2d722-a2de-43d8-9897-01bc2d744008]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.639 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[caacb9ee-056d-4412-954e-afa0d3a98364]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.656 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b4d60c-9362-4283-bee0-cb097b44cafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.665 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[be74849a-a028-4c40-885a-ce4cbaba8735]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:58.667 105898 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp5d9a0bj7/privsep.sock']
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.085 186483 DEBUG nova.compute.manager [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.086 186483 DEBUG oslo_concurrency.lockutils [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.086 186483 DEBUG oslo_concurrency.lockutils [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.086 186483 DEBUG oslo_concurrency.lockutils [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.086 186483 DEBUG nova.compute.manager [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] No waiting events found dispatching network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.087 186483 WARNING nova.compute.manager [req-b48c64cb-005c-4944-8750-de5a8e4121bf req-d3838169-0dfb-404e-be8e-451457631915 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received unexpected event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf for instance with vm_state active and task_state None.
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.255 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.255 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquired lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.256 186483 DEBUG nova.network.neutron [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 17 17:28:59 compute-0 nova_compute[186479]: 2026-02-17 17:28:59.256 186483 DEBUG nova.objects.instance [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.261 105898 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.262 105898 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5d9a0bj7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.158 215222 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.162 215222 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.165 215222 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.165 215222 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215222
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.264 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[cec9517e-b440-4054-880f-5b473fc71cb3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.803 215222 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.803 215222 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:28:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:28:59.803 215222 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.351 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[cc66cca3-66e6-442b-ab11-840481ffd7a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 NetworkManager[56323]: <info>  [1771349340.3723] manager: (tap669651c7-50): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.373 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[72a14b7f-181d-4da6-be0d-af535b1076ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 systemd-udevd[215234]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.396 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[08693a8c-534c-440a-bb4a-879b50620e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.400 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[98cac3b2-597e-49a8-87d8-66b5d6594a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 NetworkManager[56323]: <info>  [1771349340.4220] device (tap669651c7-50): carrier: link connected
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.422 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[6bfe5766-9685-4544-9cae-fa1d4db6f0a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.441 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3a2f23ba-6bad-499b-9cd7-f3bf9db23ca0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap669651c7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:0e:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309573, 'reachable_time': 17350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215252, 'error': None, 'target': 'ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.457 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[50a2a577-a48b-4886-ad10-98f47b16d772]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:e62'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 309573, 'tstamp': 309573}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215253, 'error': None, 'target': 'ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.472 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4e7a3d1d-4549-4325-9459-20d20c859ec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap669651c7-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:0e:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309573, 'reachable_time': 17350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215254, 'error': None, 'target': 'ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.502 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3f896c21-4995-44b0-974e-ff17e840a4f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.557 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1d7663-dcf8-43e8-b4ec-c7fd221cecc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.558 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap669651c7-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.559 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.559 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap669651c7-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.561 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 NetworkManager[56323]: <info>  [1771349340.5622] manager: (tap669651c7-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Feb 17 17:29:00 compute-0 kernel: tap669651c7-50: entered promiscuous mode
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.564 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.568 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap669651c7-50, col_values=(('external_ids', {'iface-id': '12e253c0-47eb-4a49-a8cd-f30bfe93bab7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:00 compute-0 ovn_controller[96568]: 2026-02-17T17:29:00Z|00031|binding|INFO|Releasing lport 12e253c0-47eb-4a49-a8cd-f30bfe93bab7 from this chassis (sb_readonly=0)
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.569 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.570 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.570 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/669651c7-5ac9-4d2e-a8a1-366ce4dcd584.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/669651c7-5ac9-4d2e-a8a1-366ce4dcd584.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.574 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.575 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2492d4f2-2b1c-4b07-85a9-41e23bd40aab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.577 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-669651c7-5ac9-4d2e-a8a1-366ce4dcd584
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/669651c7-5ac9-4d2e-a8a1-366ce4dcd584.pid.haproxy
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 669651c7-5ac9-4d2e-a8a1-366ce4dcd584
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:29:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:00.578 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'env', 'PROCESS_TAG=haproxy-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/669651c7-5ac9-4d2e-a8a1-366ce4dcd584.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:29:00 compute-0 podman[215265]: 2026-02-17 17:29:00.73342225 +0000 UTC m=+0.075391179 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:29:00 compute-0 nova_compute[186479]: 2026-02-17 17:29:00.922 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:00 compute-0 podman[215314]: 2026-02-17 17:29:00.945506817 +0000 UTC m=+0.075022219 container create 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 17 17:29:00 compute-0 systemd[1]: Started libpod-conmon-1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419.scope.
Feb 17 17:29:00 compute-0 podman[215314]: 2026-02-17 17:29:00.895504589 +0000 UTC m=+0.025020041 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:29:01 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:29:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e87618e7a10dc4b2dbbeacab27c93c2cfd99523cf1f9a54c660c9b26419c9db/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:29:01 compute-0 podman[215314]: 2026-02-17 17:29:01.034255912 +0000 UTC m=+0.163771344 container init 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 17 17:29:01 compute-0 podman[215314]: 2026-02-17 17:29:01.038461881 +0000 UTC m=+0.167977283 container start 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 17 17:29:01 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [NOTICE]   (215334) : New worker (215336) forked
Feb 17 17:29:01 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [NOTICE]   (215334) : Loading success.
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.413 186483 DEBUG nova.network.neutron [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.432 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Releasing lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.433 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.433 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.433 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.434 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.453 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.454 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.454 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.455 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.516 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.585 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.586 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.633 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.763 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.765 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5577MB free_disk=73.2085952758789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.765 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.765 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.873 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Instance 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.873 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.873 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.937 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing inventories for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.994 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating ProviderTree inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 17 17:29:02 compute-0 nova_compute[186479]: 2026-02-17 17:29:02.994 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.019 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing aggregate associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.043 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing trait associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SHA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_BMI2,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_AVX,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.093 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.133 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updated inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.133 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating resource provider c9b7a021-c13f-4158-9f46-47cefef2fece generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.133 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.158 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.158 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.393s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.341 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:03 compute-0 podman[215352]: 2026-02-17 17:29:03.73137867 +0000 UTC m=+0.072678839 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:29:03 compute-0 ovn_controller[96568]: 2026-02-17T17:29:03Z|00032|binding|INFO|Releasing lport 12e253c0-47eb-4a49-a8cd-f30bfe93bab7 from this chassis (sb_readonly=0)
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.859 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8693] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8700] device (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <warn>  [1771349343.8703] device (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8711] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8714] device (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <warn>  [1771349343.8714] device (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8721] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8726] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8731] device (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 17 17:29:03 compute-0 NetworkManager[56323]: <info>  [1771349343.8734] device (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 17 17:29:03 compute-0 ovn_controller[96568]: 2026-02-17T17:29:03Z|00033|binding|INFO|Releasing lport 12e253c0-47eb-4a49-a8cd-f30bfe93bab7 from this chassis (sb_readonly=0)
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.875 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:03 compute-0 nova_compute[186479]: 2026-02-17 17:29:03.880 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:04 compute-0 nova_compute[186479]: 2026-02-17 17:29:04.114 186483 DEBUG nova.compute.manager [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:04 compute-0 nova_compute[186479]: 2026-02-17 17:29:04.115 186483 DEBUG nova.compute.manager [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing instance network info cache due to event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:29:04 compute-0 nova_compute[186479]: 2026-02-17 17:29:04.116 186483 DEBUG oslo_concurrency.lockutils [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:29:04 compute-0 nova_compute[186479]: 2026-02-17 17:29:04.116 186483 DEBUG oslo_concurrency.lockutils [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:29:04 compute-0 nova_compute[186479]: 2026-02-17 17:29:04.116 186483 DEBUG nova.network.neutron [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:29:05 compute-0 nova_compute[186479]: 2026-02-17 17:29:05.924 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:06 compute-0 nova_compute[186479]: 2026-02-17 17:29:06.606 186483 DEBUG nova.network.neutron [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updated VIF entry in instance network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:29:06 compute-0 nova_compute[186479]: 2026-02-17 17:29:06.606 186483 DEBUG nova.network.neutron [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:06 compute-0 nova_compute[186479]: 2026-02-17 17:29:06.623 186483 DEBUG oslo_concurrency.lockutils [req-a0432485-a1dc-4273-9824-764ad473f163 req-c0940e14-35b5-4e0d-a761-641835363334 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:29:08 compute-0 nova_compute[186479]: 2026-02-17 17:29:08.344 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:08 compute-0 ovn_controller[96568]: 2026-02-17T17:29:08Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1f:63:df 10.100.0.10
Feb 17 17:29:08 compute-0 ovn_controller[96568]: 2026-02-17T17:29:08Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1f:63:df 10.100.0.10
Feb 17 17:29:09 compute-0 podman[215387]: 2026-02-17 17:29:09.722010973 +0000 UTC m=+0.062615657 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 17 17:29:10 compute-0 nova_compute[186479]: 2026-02-17 17:29:10.927 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:10.945 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:10.948 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:10.950 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:13 compute-0 nova_compute[186479]: 2026-02-17 17:29:13.393 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:14 compute-0 nova_compute[186479]: 2026-02-17 17:29:14.678 186483 INFO nova.compute.manager [None req-974d72c5-b1a6-4547-bc69-b82563d59b83 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Get console output
Feb 17 17:29:14 compute-0 nova_compute[186479]: 2026-02-17 17:29:14.784 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:29:15 compute-0 nova_compute[186479]: 2026-02-17 17:29:15.929 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:18 compute-0 nova_compute[186479]: 2026-02-17 17:29:18.396 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:18 compute-0 podman[215408]: 2026-02-17 17:29:18.708834499 +0000 UTC m=+0.054659220 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 17 17:29:18 compute-0 podman[215409]: 2026-02-17 17:29:18.722039473 +0000 UTC m=+0.063062579 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:29:20 compute-0 nova_compute[186479]: 2026-02-17 17:29:20.931 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:22 compute-0 podman[215445]: 2026-02-17 17:29:22.711791446 +0000 UTC m=+0.051872958 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:29:23 compute-0 nova_compute[186479]: 2026-02-17 17:29:23.398 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.423 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.424 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.441 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.515 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.516 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.525 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.525 186483 INFO nova.compute.claims [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.654 186483 DEBUG nova.compute.provider_tree [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.682 186483 DEBUG nova.scheduler.client.report [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.706 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.706 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.744 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.744 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.760 186483 INFO nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.776 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.870 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.871 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.872 186483 INFO nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Creating image(s)
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.873 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.873 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.875 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.900 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.975 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.976 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:24 compute-0 nova_compute[186479]: 2026-02-17 17:29:24.977 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.003 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.050 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.051 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.212 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk 1073741824" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.213 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.214 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.289 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.290 186483 DEBUG nova.virt.disk.api [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.291 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.307 186483 DEBUG nova.policy [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.338 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.339 186483 DEBUG nova.virt.disk.api [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.339 186483 DEBUG nova.objects.instance [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid ee0a4091-de19-47a0-b8ea-04c5e762e3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.361 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.362 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Ensure instance console log exists: /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.363 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.363 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.363 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:25 compute-0 nova_compute[186479]: 2026-02-17 17:29:25.933 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:26 compute-0 nova_compute[186479]: 2026-02-17 17:29:26.394 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Successfully created port: 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.267 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Successfully updated port: 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.281 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.281 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.282 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.352 186483 DEBUG nova.compute.manager [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-changed-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.352 186483 DEBUG nova.compute.manager [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Refreshing instance network info cache due to event network-changed-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.353 186483 DEBUG oslo_concurrency.lockutils [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:29:27 compute-0 nova_compute[186479]: 2026-02-17 17:29:27.392 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.298 186483 DEBUG nova.network.neutron [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Updating instance_info_cache with network_info: [{"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.328 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.329 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Instance network_info: |[{"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.329 186483 DEBUG oslo_concurrency.lockutils [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.330 186483 DEBUG nova.network.neutron [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Refreshing network info cache for port 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.334 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Start _get_guest_xml network_info=[{"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.339 186483 WARNING nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.344 186483 DEBUG nova.virt.libvirt.host [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.345 186483 DEBUG nova.virt.libvirt.host [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.349 186483 DEBUG nova.virt.libvirt.host [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.350 186483 DEBUG nova.virt.libvirt.host [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.350 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.351 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.351 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.352 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.352 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.353 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.353 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.353 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.354 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.354 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.354 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.355 186483 DEBUG nova.virt.hardware [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.360 186483 DEBUG nova.virt.libvirt.vif [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:29:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1809346088',display_name='tempest-TestNetworkBasicOps-server-1809346088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1809346088',id=2,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5ULlRD2vQtmRfRh44x7/Sd0Ax8Q9hdx7xSs9HEKQbcmA+hsHRFEl04WH7D0mJhkCGdmQE0OoDqzOCJjz1VwLEyNiQb6SIbfN4NwvIbtMVknYAApCMcVWfRakX82Q6ukA==',key_name='tempest-TestNetworkBasicOps-579224632',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-oqhxv4c2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:29:24Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=ee0a4091-de19-47a0-b8ea-04c5e762e3f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.361 186483 DEBUG nova.network.os_vif_util [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.362 186483 DEBUG nova.network.os_vif_util [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.363 186483 DEBUG nova.objects.instance [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid ee0a4091-de19-47a0-b8ea-04c5e762e3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.380 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <uuid>ee0a4091-de19-47a0-b8ea-04c5e762e3f0</uuid>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <name>instance-00000002</name>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-1809346088</nova:name>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:29:28</nova:creationTime>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         <nova:port uuid="661bf6ca-ac54-4a17-9eaa-c4b3303f33cb">
Feb 17 17:29:28 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <system>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="serial">ee0a4091-de19-47a0-b8ea-04c5e762e3f0</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="uuid">ee0a4091-de19-47a0-b8ea-04c5e762e3f0</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </system>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <os>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </os>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <features>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </features>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.config"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:77:e5:c3"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <target dev="tap661bf6ca-ac"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/console.log" append="off"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <video>
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </video>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:29:28 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:29:28 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:29:28 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:29:28 compute-0 nova_compute[186479]: </domain>
Feb 17 17:29:28 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.382 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Preparing to wait for external event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.383 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.383 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.383 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.384 186483 DEBUG nova.virt.libvirt.vif [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:29:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1809346088',display_name='tempest-TestNetworkBasicOps-server-1809346088',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1809346088',id=2,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5ULlRD2vQtmRfRh44x7/Sd0Ax8Q9hdx7xSs9HEKQbcmA+hsHRFEl04WH7D0mJhkCGdmQE0OoDqzOCJjz1VwLEyNiQb6SIbfN4NwvIbtMVknYAApCMcVWfRakX82Q6ukA==',key_name='tempest-TestNetworkBasicOps-579224632',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-oqhxv4c2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:29:24Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=ee0a4091-de19-47a0-b8ea-04c5e762e3f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.385 186483 DEBUG nova.network.os_vif_util [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.385 186483 DEBUG nova.network.os_vif_util [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.386 186483 DEBUG os_vif [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.387 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.387 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.388 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.391 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.392 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap661bf6ca-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.392 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap661bf6ca-ac, col_values=(('external_ids', {'iface-id': '661bf6ca-ac54-4a17-9eaa-c4b3303f33cb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:e5:c3', 'vm-uuid': 'ee0a4091-de19-47a0-b8ea-04c5e762e3f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.394 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 NetworkManager[56323]: <info>  [1771349368.3954] manager: (tap661bf6ca-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.398 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.401 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.404 186483 INFO os_vif [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac')
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.457 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.457 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.458 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:77:e5:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.459 186483 INFO nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Using config drive
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.799 186483 INFO nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Creating config drive at /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.config
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.803 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmutm1zhg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.925 186483 DEBUG oslo_concurrency.processutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpmutm1zhg" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:29:28 compute-0 NetworkManager[56323]: <info>  [1771349368.9721] manager: (tap661bf6ca-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Feb 17 17:29:28 compute-0 kernel: tap661bf6ca-ac: entered promiscuous mode
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 ovn_controller[96568]: 2026-02-17T17:29:28Z|00034|binding|INFO|Claiming lport 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb for this chassis.
Feb 17 17:29:28 compute-0 ovn_controller[96568]: 2026-02-17T17:29:28Z|00035|binding|INFO|661bf6ca-ac54-4a17-9eaa-c4b3303f33cb: Claiming fa:16:3e:77:e5:c3 10.100.0.24
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.977 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:28.987 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:e5:c3 10.100.0.24'], port_security=['fa:16:3e:77:e5:c3 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'ee0a4091-de19-47a0-b8ea-04c5e762e3f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5d5693ba-d788-4ad4-94a4-6095b9fc095b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b866f6e-e180-40ae-835b-e7744abd8023, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:29:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:28.989 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb in datapath 4b2e1182-92f8-44cc-8c9f-8f16dd796270 bound to our chassis
Feb 17 17:29:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:28.990 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b2e1182-92f8-44cc-8c9f-8f16dd796270
Feb 17 17:29:28 compute-0 ovn_controller[96568]: 2026-02-17T17:29:28Z|00036|binding|INFO|Setting lport 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb ovn-installed in OVS
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.991 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:28 compute-0 ovn_controller[96568]: 2026-02-17T17:29:28Z|00037|binding|INFO|Setting lport 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb up in Southbound
Feb 17 17:29:28 compute-0 nova_compute[186479]: 2026-02-17 17:29:28.992 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.003 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b8287e0e-9c3a-4390-9c9b-1ce62ad6eb03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.004 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b2e1182-91 in ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:29:29 compute-0 systemd-machined[155877]: New machine qemu-2-instance-00000002.
Feb 17 17:29:29 compute-0 systemd-udevd[215505]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.006 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b2e1182-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.006 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[64ccfd15-3a6b-4770-90cd-9696aa1c7bab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.006 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[546aa05d-c3dd-45d6-b3d1-af8f7b955893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 NetworkManager[56323]: <info>  [1771349369.0179] device (tap661bf6ca-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:29:29 compute-0 NetworkManager[56323]: <info>  [1771349369.0190] device (tap661bf6ca-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:29:29 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.026 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0ab27c-c87b-4db1-aca1-b3830b5340d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.055 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[19f8da29-139e-461a-83d9-c436c0028a8c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.080 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[7a78fd8d-23b2-4c1e-bee5-f87ae864b2f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.086 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c96e5a33-b69f-4ef2-b4e8-1fdecb018b89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 NetworkManager[56323]: <info>  [1771349369.0874] manager: (tap4b2e1182-90): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.113 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[957a5019-9150-4dcb-b39d-0d92e6c09179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.115 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[3a547cb9-b519-4147-a8f9-1c541f03f866]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 NetworkManager[56323]: <info>  [1771349369.1354] device (tap4b2e1182-90): carrier: link connected
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.139 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[8a16ff35-83fd-47b2-8959-fe0a3c55a773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.155 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0eff9b27-d882-4eea-92dd-843b15bb77de]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b2e1182-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:26:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312444, 'reachable_time': 42938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215538, 'error': None, 'target': 'ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.172 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[20da5231-d20f-4e21-8017-6570ab9bce24]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:26c1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 312444, 'tstamp': 312444}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215540, 'error': None, 'target': 'ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.187 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2820f833-c57a-4783-8a86-a28debaed0cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b2e1182-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:26:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312444, 'reachable_time': 42938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215542, 'error': None, 'target': 'ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.212 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[990745c5-19f0-4a14-b17a-523755579bb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.258 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349369.2581282, ee0a4091-de19-47a0-b8ea-04c5e762e3f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.258 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] VM Started (Lifecycle Event)
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.272 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c868002f-7eab-4d45-ac4b-39d3569b5c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.273 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2e1182-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.274 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.274 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2e1182-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.277 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.286 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349369.2607868, ee0a4091-de19-47a0-b8ea-04c5e762e3f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.286 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] VM Paused (Lifecycle Event)
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.305 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.308 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:29:29 compute-0 kernel: tap4b2e1182-90: entered promiscuous mode
Feb 17 17:29:29 compute-0 NetworkManager[56323]: <info>  [1771349369.3139] manager: (tap4b2e1182-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.313 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.316 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.317 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b2e1182-90, col_values=(('external_ids', {'iface-id': '7723a283-3b18-4457-9a00-6a8dcec7523a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:29 compute-0 ovn_controller[96568]: 2026-02-17T17:29:29Z|00038|binding|INFO|Releasing lport 7723a283-3b18-4457-9a00-6a8dcec7523a from this chassis (sb_readonly=0)
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.319 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.319 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b2e1182-92f8-44cc-8c9f-8f16dd796270.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b2e1182-92f8-44cc-8c9f-8f16dd796270.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.320 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[41af1865-96eb-47e2-8671-b8afc2a1d337]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.321 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-4b2e1182-92f8-44cc-8c9f-8f16dd796270
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/4b2e1182-92f8-44cc-8c9f-8f16dd796270.pid.haproxy
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 4b2e1182-92f8-44cc-8c9f-8f16dd796270
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:29:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:29.321 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'env', 'PROCESS_TAG=haproxy-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b2e1182-92f8-44cc-8c9f-8f16dd796270.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.323 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.332 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.672 186483 DEBUG nova.network.neutron [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Updated VIF entry in instance network info cache for port 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.672 186483 DEBUG nova.network.neutron [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Updating instance_info_cache with network_info: [{"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:29 compute-0 podman[215579]: 2026-02-17 17:29:29.684605512 +0000 UTC m=+0.063751553 container create 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.690 186483 DEBUG oslo_concurrency.lockutils [req-c3c84e8a-9e66-4d52-8c26-d9b567b81cee req-e2efb91e-c78e-4492-8cba-4b4f8b7e7ab5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-ee0a4091-de19-47a0-b8ea-04c5e762e3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:29:29 compute-0 systemd[1]: Started libpod-conmon-835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc.scope.
Feb 17 17:29:29 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:29:29 compute-0 podman[215579]: 2026-02-17 17:29:29.65454875 +0000 UTC m=+0.033694871 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:29:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/134a8911090c261102ebd0c495be4f3428fd14c16a55caf3572dcbda897392f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:29:29 compute-0 podman[215579]: 2026-02-17 17:29:29.76104462 +0000 UTC m=+0.140190731 container init 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 17 17:29:29 compute-0 podman[215579]: 2026-02-17 17:29:29.765159349 +0000 UTC m=+0.144305420 container start 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:29:29 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [NOTICE]   (215598) : New worker (215600) forked
Feb 17 17:29:29 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [NOTICE]   (215598) : Loading success.
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.818 186483 DEBUG nova.compute.manager [req-cecddab2-906c-4be9-9b0b-daca9c2080db req-14c4cf44-c405-4282-a030-c1f38f0d7266 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.818 186483 DEBUG oslo_concurrency.lockutils [req-cecddab2-906c-4be9-9b0b-daca9c2080db req-14c4cf44-c405-4282-a030-c1f38f0d7266 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.818 186483 DEBUG oslo_concurrency.lockutils [req-cecddab2-906c-4be9-9b0b-daca9c2080db req-14c4cf44-c405-4282-a030-c1f38f0d7266 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.819 186483 DEBUG oslo_concurrency.lockutils [req-cecddab2-906c-4be9-9b0b-daca9c2080db req-14c4cf44-c405-4282-a030-c1f38f0d7266 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.819 186483 DEBUG nova.compute.manager [req-cecddab2-906c-4be9-9b0b-daca9c2080db req-14c4cf44-c405-4282-a030-c1f38f0d7266 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Processing event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.820 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.823 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.824 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349369.8236504, ee0a4091-de19-47a0-b8ea-04c5e762e3f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.824 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] VM Resumed (Lifecycle Event)
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.829 186483 INFO nova.virt.libvirt.driver [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Instance spawned successfully.
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.829 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.854 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.858 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.862 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.862 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.862 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.863 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.863 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.864 186483 DEBUG nova.virt.libvirt.driver [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.892 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.928 186483 INFO nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Took 5.06 seconds to spawn the instance on the hypervisor.
Feb 17 17:29:29 compute-0 nova_compute[186479]: 2026-02-17 17:29:29.928 186483 DEBUG nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:29:30 compute-0 nova_compute[186479]: 2026-02-17 17:29:30.005 186483 INFO nova.compute.manager [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Took 5.52 seconds to build instance.
Feb 17 17:29:30 compute-0 nova_compute[186479]: 2026-02-17 17:29:30.022 186483 DEBUG oslo_concurrency.lockutils [None req-15d0f33c-8774-44c4-b2ce-3f704fe513d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:31 compute-0 podman[215609]: 2026-02-17 17:29:31.742844362 +0000 UTC m=+0.083156000 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.925 186483 DEBUG nova.compute.manager [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.925 186483 DEBUG oslo_concurrency.lockutils [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.925 186483 DEBUG oslo_concurrency.lockutils [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.926 186483 DEBUG oslo_concurrency.lockutils [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.927 186483 DEBUG nova.compute.manager [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] No waiting events found dispatching network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:29:31 compute-0 nova_compute[186479]: 2026-02-17 17:29:31.927 186483 WARNING nova.compute.manager [req-35fbaf51-4ccd-43a4-a23e-8c3c35abf910 req-e5b57dcb-3a05-4f21-8bed-d303e17c2e77 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received unexpected event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb for instance with vm_state active and task_state None.
Feb 17 17:29:33 compute-0 nova_compute[186479]: 2026-02-17 17:29:33.424 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:33 compute-0 nova_compute[186479]: 2026-02-17 17:29:33.426 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:29:33 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:33.768 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:29:33 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:33.770 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:29:33 compute-0 nova_compute[186479]: 2026-02-17 17:29:33.770 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:34 compute-0 podman[215633]: 2026-02-17 17:29:34.754310783 +0000 UTC m=+0.094498212 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:29:38 compute-0 nova_compute[186479]: 2026-02-17 17:29:38.427 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:40 compute-0 podman[215664]: 2026-02-17 17:29:40.73795151 +0000 UTC m=+0.068603830 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64)
Feb 17 17:29:42 compute-0 sshd-session[215691]: Invalid user admin from 209.38.233.161 port 56796
Feb 17 17:29:42 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:42.772 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:42 compute-0 sshd-session[215691]: Connection closed by invalid user admin 209.38.233.161 port 56796 [preauth]
Feb 17 17:29:42 compute-0 ovn_controller[96568]: 2026-02-17T17:29:42Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:77:e5:c3 10.100.0.24
Feb 17 17:29:42 compute-0 ovn_controller[96568]: 2026-02-17T17:29:42Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:77:e5:c3 10.100.0.24
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.428 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:29:43 compute-0 nova_compute[186479]: 2026-02-17 17:29:43.432 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:48 compute-0 nova_compute[186479]: 2026-02-17 17:29:48.432 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:49 compute-0 podman[215694]: 2026-02-17 17:29:49.731150214 +0000 UTC m=+0.059290267 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:29:49 compute-0 podman[215693]: 2026-02-17 17:29:49.737014235 +0000 UTC m=+0.063338723 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.772 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.773 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.773 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.774 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.774 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.777 186483 INFO nova.compute.manager [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Terminating instance
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.778 186483 DEBUG nova.compute.manager [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:29:49 compute-0 kernel: tap661bf6ca-ac (unregistering): left promiscuous mode
Feb 17 17:29:49 compute-0 NetworkManager[56323]: <info>  [1771349389.8236] device (tap661bf6ca-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.831 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:49 compute-0 ovn_controller[96568]: 2026-02-17T17:29:49Z|00039|binding|INFO|Releasing lport 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb from this chassis (sb_readonly=0)
Feb 17 17:29:49 compute-0 ovn_controller[96568]: 2026-02-17T17:29:49Z|00040|binding|INFO|Setting lport 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb down in Southbound
Feb 17 17:29:49 compute-0 ovn_controller[96568]: 2026-02-17T17:29:49Z|00041|binding|INFO|Removing iface tap661bf6ca-ac ovn-installed in OVS
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.834 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:49.840 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:e5:c3 10.100.0.24'], port_security=['fa:16:3e:77:e5:c3 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'ee0a4091-de19-47a0-b8ea-04c5e762e3f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5d5693ba-d788-4ad4-94a4-6095b9fc095b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b866f6e-e180-40ae-835b-e7744abd8023, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:29:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:49.841 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 661bf6ca-ac54-4a17-9eaa-c4b3303f33cb in datapath 4b2e1182-92f8-44cc-8c9f-8f16dd796270 unbound from our chassis
Feb 17 17:29:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:49.842 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b2e1182-92f8-44cc-8c9f-8f16dd796270, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:29:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:49.843 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0dbcac-527d-4c34-a1e1-55088594a0c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:49.843 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270 namespace which is not needed anymore
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.846 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:49 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 17 17:29:49 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.082s CPU time.
Feb 17 17:29:49 compute-0 systemd-machined[155877]: Machine qemu-2-instance-00000002 terminated.
Feb 17 17:29:49 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [NOTICE]   (215598) : haproxy version is 2.8.14-c23fe91
Feb 17 17:29:49 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [NOTICE]   (215598) : path to executable is /usr/sbin/haproxy
Feb 17 17:29:49 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [WARNING]  (215598) : Exiting Master process...
Feb 17 17:29:49 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [ALERT]    (215598) : Current worker (215600) exited with code 143 (Terminated)
Feb 17 17:29:49 compute-0 neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270[215594]: [WARNING]  (215598) : All workers exited. Exiting... (0)
Feb 17 17:29:49 compute-0 systemd[1]: libpod-835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc.scope: Deactivated successfully.
Feb 17 17:29:49 compute-0 podman[215758]: 2026-02-17 17:29:49.980681101 +0000 UTC m=+0.043036145 container died 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.991 186483 DEBUG nova.compute.manager [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-unplugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.992 186483 DEBUG oslo_concurrency.lockutils [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.992 186483 DEBUG oslo_concurrency.lockutils [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.992 186483 DEBUG oslo_concurrency.lockutils [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.993 186483 DEBUG nova.compute.manager [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] No waiting events found dispatching network-vif-unplugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:29:49 compute-0 nova_compute[186479]: 2026-02-17 17:29:49.993 186483 DEBUG nova.compute.manager [req-2bf91a10-7bb8-4826-8bf2-a6b302389cc9 req-e54f2146-2dd5-45f7-bcb6-719bfc773697 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-unplugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.002 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.006 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc-userdata-shm.mount: Deactivated successfully.
Feb 17 17:29:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-134a8911090c261102ebd0c495be4f3428fd14c16a55caf3572dcbda897392f2-merged.mount: Deactivated successfully.
Feb 17 17:29:50 compute-0 podman[215758]: 2026-02-17 17:29:50.023628994 +0000 UTC m=+0.085984038 container cleanup 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 17 17:29:50 compute-0 systemd[1]: libpod-conmon-835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc.scope: Deactivated successfully.
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.035 186483 INFO nova.virt.libvirt.driver [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Instance destroyed successfully.
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.036 186483 DEBUG nova.objects.instance [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid ee0a4091-de19-47a0-b8ea-04c5e762e3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.048 186483 DEBUG nova.virt.libvirt.vif [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:29:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1809346088',display_name='tempest-TestNetworkBasicOps-server-1809346088',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1809346088',id=2,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5ULlRD2vQtmRfRh44x7/Sd0Ax8Q9hdx7xSs9HEKQbcmA+hsHRFEl04WH7D0mJhkCGdmQE0OoDqzOCJjz1VwLEyNiQb6SIbfN4NwvIbtMVknYAApCMcVWfRakX82Q6ukA==',key_name='tempest-TestNetworkBasicOps-579224632',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:29:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-oqhxv4c2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:29:29Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=ee0a4091-de19-47a0-b8ea-04c5e762e3f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.048 186483 DEBUG nova.network.os_vif_util [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "address": "fa:16:3e:77:e5:c3", "network": {"id": "4b2e1182-92f8-44cc-8c9f-8f16dd796270", "bridge": "br-int", "label": "tempest-network-smoke--1074554491", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661bf6ca-ac", "ovs_interfaceid": "661bf6ca-ac54-4a17-9eaa-c4b3303f33cb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.049 186483 DEBUG nova.network.os_vif_util [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.049 186483 DEBUG os_vif [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.052 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.052 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap661bf6ca-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.054 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.056 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.059 186483 INFO os_vif [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:e5:c3,bridge_name='br-int',has_traffic_filtering=True,id=661bf6ca-ac54-4a17-9eaa-c4b3303f33cb,network=Network(4b2e1182-92f8-44cc-8c9f-8f16dd796270),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661bf6ca-ac')
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.060 186483 INFO nova.virt.libvirt.driver [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Deleting instance files /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0_del
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.060 186483 INFO nova.virt.libvirt.driver [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Deletion of /var/lib/nova/instances/ee0a4091-de19-47a0-b8ea-04c5e762e3f0_del complete
Feb 17 17:29:50 compute-0 podman[215805]: 2026-02-17 17:29:50.090003949 +0000 UTC m=+0.044218273 container remove 835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.094 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fc900994-ac82-4514-96fd-c3eba530f277]: (4, ('Tue Feb 17 05:29:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270 (835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc)\n835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc\nTue Feb 17 05:29:50 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270 (835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc)\n835dbc160a2ee063c55d55820af29813b2a3011d9d09a05b82884c81c80b6bbc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.096 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[265fce51-c660-47fc-a122-0555f6f4fcd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.097 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2e1182-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.099 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 kernel: tap4b2e1182-90: left promiscuous mode
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.106 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.109 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ac9a99-d34a-47d0-807e-5303b9866763]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.120 186483 DEBUG nova.virt.libvirt.host [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.121 186483 INFO nova.virt.libvirt.host [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] UEFI support detected
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.120 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3df4b6c8-a344-4c77-9044-04a590863d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.122 186483 INFO nova.compute.manager [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.122 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[84332a91-f554-4708-a360-97af658c27e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.122 186483 DEBUG oslo.service.loopingcall [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.123 186483 DEBUG nova.compute.manager [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.123 186483 DEBUG nova.network.neutron [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.138 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[5efaa732-ac33-4ca6-8172-79188b506aa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312438, 'reachable_time': 38611, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215820, 'error': None, 'target': 'ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b2e1182\x2d92f8\x2d44cc\x2d8c9f\x2d8f16dd796270.mount: Deactivated successfully.
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.147 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b2e1182-92f8-44cc-8c9f-8f16dd796270 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:29:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:50.148 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[66a010c7-53ac-4247-9422-28e688d01243]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.773 186483 DEBUG nova.network.neutron [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.794 186483 INFO nova.compute.manager [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Took 0.67 seconds to deallocate network for instance.
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.840 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.841 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.861 186483 DEBUG nova.compute.manager [req-498361e9-0198-4ca5-94ea-caa58832856a req-c18e819d-262f-43e1-aa59-12bede355079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-deleted-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.931 186483 DEBUG nova.compute.provider_tree [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.947 186483 DEBUG nova.scheduler.client.report [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.967 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:50 compute-0 nova_compute[186479]: 2026-02-17 17:29:50.990 186483 INFO nova.scheduler.client.report [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance ee0a4091-de19-47a0-b8ea-04c5e762e3f0
Feb 17 17:29:51 compute-0 nova_compute[186479]: 2026-02-17 17:29:51.070 186483 DEBUG oslo_concurrency.lockutils [None req-c28159e2-1cf3-4b1e-86b8-7895ee989ff4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.297s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.053 186483 DEBUG nova.compute.manager [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.053 186483 DEBUG oslo_concurrency.lockutils [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.053 186483 DEBUG oslo_concurrency.lockutils [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.054 186483 DEBUG oslo_concurrency.lockutils [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "ee0a4091-de19-47a0-b8ea-04c5e762e3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.054 186483 DEBUG nova.compute.manager [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] No waiting events found dispatching network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:29:52 compute-0 nova_compute[186479]: 2026-02-17 17:29:52.054 186483 WARNING nova.compute.manager [req-2d105627-10e6-46d0-a861-9e616c352266 req-3c0c2b88-e763-4775-9a5d-5ce36a28b13a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Received unexpected event network-vif-plugged-661bf6ca-ac54-4a17-9eaa-c4b3303f33cb for instance with vm_state deleted and task_state None.
Feb 17 17:29:53 compute-0 nova_compute[186479]: 2026-02-17 17:29:53.435 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:53 compute-0 ovn_controller[96568]: 2026-02-17T17:29:53Z|00042|binding|INFO|Releasing lport 12e253c0-47eb-4a49-a8cd-f30bfe93bab7 from this chassis (sb_readonly=0)
Feb 17 17:29:53 compute-0 nova_compute[186479]: 2026-02-17 17:29:53.497 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:53 compute-0 podman[215822]: 2026-02-17 17:29:53.729568508 +0000 UTC m=+0.068842416 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.474 186483 DEBUG nova.compute.manager [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.475 186483 DEBUG nova.compute.manager [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing instance network info cache due to event network-changed-32811792-c388-4fbf-ae6d-43c1e4b213bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.475 186483 DEBUG oslo_concurrency.lockutils [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.476 186483 DEBUG oslo_concurrency.lockutils [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.476 186483 DEBUG nova.network.neutron [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Refreshing network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.580 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.581 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.582 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.582 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.583 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.584 186483 INFO nova.compute.manager [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Terminating instance
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.586 186483 DEBUG nova.compute.manager [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:29:54 compute-0 kernel: tap32811792-c3 (unregistering): left promiscuous mode
Feb 17 17:29:54 compute-0 NetworkManager[56323]: <info>  [1771349394.6133] device (tap32811792-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.616 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 ovn_controller[96568]: 2026-02-17T17:29:54Z|00043|binding|INFO|Releasing lport 32811792-c388-4fbf-ae6d-43c1e4b213bf from this chassis (sb_readonly=0)
Feb 17 17:29:54 compute-0 ovn_controller[96568]: 2026-02-17T17:29:54Z|00044|binding|INFO|Setting lport 32811792-c388-4fbf-ae6d-43c1e4b213bf down in Southbound
Feb 17 17:29:54 compute-0 ovn_controller[96568]: 2026-02-17T17:29:54Z|00045|binding|INFO|Removing iface tap32811792-c3 ovn-installed in OVS
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.623 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:63:df 10.100.0.10'], port_security=['fa:16:3e:1f:63:df 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '05969d3c-be24-4ef7-ba4b-be8a5826cb11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=022c023e-9b10-4231-98a6-e0cad7ebafd6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=32811792-c388-4fbf-ae6d-43c1e4b213bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.624 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 32811792-c388-4fbf-ae6d-43c1e4b213bf in datapath 669651c7-5ac9-4d2e-a8a1-366ce4dcd584 unbound from our chassis
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.625 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 669651c7-5ac9-4d2e-a8a1-366ce4dcd584, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.626 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0df18df2-31bd-4efb-833f-c4794b501235]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.627 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.627 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584 namespace which is not needed anymore
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.628 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 17 17:29:54 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.587s CPU time.
Feb 17 17:29:54 compute-0 systemd-machined[155877]: Machine qemu-1-instance-00000001 terminated.
Feb 17 17:29:54 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [NOTICE]   (215334) : haproxy version is 2.8.14-c23fe91
Feb 17 17:29:54 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [NOTICE]   (215334) : path to executable is /usr/sbin/haproxy
Feb 17 17:29:54 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [WARNING]  (215334) : Exiting Master process...
Feb 17 17:29:54 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [ALERT]    (215334) : Current worker (215336) exited with code 143 (Terminated)
Feb 17 17:29:54 compute-0 neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584[215330]: [WARNING]  (215334) : All workers exited. Exiting... (0)
Feb 17 17:29:54 compute-0 systemd[1]: libpod-1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419.scope: Deactivated successfully.
Feb 17 17:29:54 compute-0 podman[215872]: 2026-02-17 17:29:54.777488374 +0000 UTC m=+0.053306042 container died 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 17 17:29:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419-userdata-shm.mount: Deactivated successfully.
Feb 17 17:29:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-4e87618e7a10dc4b2dbbeacab27c93c2cfd99523cf1f9a54c660c9b26419c9db-merged.mount: Deactivated successfully.
Feb 17 17:29:54 compute-0 podman[215872]: 2026-02-17 17:29:54.819267269 +0000 UTC m=+0.095084927 container cleanup 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 17 17:29:54 compute-0 systemd[1]: libpod-conmon-1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419.scope: Deactivated successfully.
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.845 186483 INFO nova.virt.libvirt.driver [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance destroyed successfully.
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.846 186483 DEBUG nova.objects.instance [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.858 186483 DEBUG nova.virt.libvirt.vif [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:28:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-572755466',display_name='tempest-TestNetworkBasicOps-server-572755466',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-572755466',id=1,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDujMpTWlrR3Ei876npo995e/qq2Qz02Rp1wCi/x8Ta1eJB1HmxKd48/OhgRcLr6r+tp+DDPpmJ2LCpL/r9Wcf04pbpqIidfhknfTDr1PIqBOZQW02EVQTDBpRV4oH7OBA==',key_name='tempest-TestNetworkBasicOps-599335326',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:28:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-birwp627',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:28:57Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.859 186483 DEBUG nova.network.os_vif_util [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.859 186483 DEBUG nova.network.os_vif_util [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.860 186483 DEBUG os_vif [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.861 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.862 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32811792-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.865 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.867 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.871 186483 INFO os_vif [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:63:df,bridge_name='br-int',has_traffic_filtering=True,id=32811792-c388-4fbf-ae6d-43c1e4b213bf,network=Network(669651c7-5ac9-4d2e-a8a1-366ce4dcd584),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32811792-c3')
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.872 186483 INFO nova.virt.libvirt.driver [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Deleting instance files /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa_del
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.873 186483 INFO nova.virt.libvirt.driver [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Deletion of /var/lib/nova/instances/8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa_del complete
Feb 17 17:29:54 compute-0 podman[215912]: 2026-02-17 17:29:54.890012969 +0000 UTC m=+0.044810938 container remove 1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.895 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[9c777ba7-3489-4e75-b11e-5b1964f7f3b6]: (4, ('Tue Feb 17 05:29:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584 (1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419)\n1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419\nTue Feb 17 05:29:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584 (1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419)\n1fd213c7e3921594f949833679fc1087459c933da6358dbe66bbb7e5181b4419\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.897 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[31796088-a70c-48ca-8540-8f1bf7e69070]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.898 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap669651c7-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.900 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 kernel: tap669651c7-50: left promiscuous mode
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.905 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.906 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.908 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[694c215f-6425-47ab-9649-a34dec969e18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.917 186483 INFO nova.compute.manager [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.918 186483 DEBUG oslo.service.loopingcall [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.918 186483 DEBUG nova.compute.manager [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:29:54 compute-0 nova_compute[186479]: 2026-02-17 17:29:54.919 186483 DEBUG nova.network.neutron [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.926 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c9af4bce-5f65-4e0a-9a3c-6f8bd0c18691]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.928 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[06845622-f4ae-4872-99e0-a00b92193fab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.940 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[999d3732-ad55-4a9b-a713-b0876df89ac1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309565, 'reachable_time': 42970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215930, 'error': None, 'target': 'ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.942 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-669651c7-5ac9-4d2e-a8a1-366ce4dcd584 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:29:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:29:54.942 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[e24f8255-dc14-40ef-a60a-2039cb3e835e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:29:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d669651c7\x2d5ac9\x2d4d2e\x2da8a1\x2d366ce4dcd584.mount: Deactivated successfully.
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.865 186483 DEBUG nova.network.neutron [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.894 186483 INFO nova.compute.manager [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Took 0.98 seconds to deallocate network for instance.
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.938 186483 DEBUG nova.network.neutron [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updated VIF entry in instance network info cache for port 32811792-c388-4fbf-ae6d-43c1e4b213bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.938 186483 DEBUG nova.network.neutron [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Updating instance_info_cache with network_info: [{"id": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "address": "fa:16:3e:1f:63:df", "network": {"id": "669651c7-5ac9-4d2e-a8a1-366ce4dcd584", "bridge": "br-int", "label": "tempest-network-smoke--1906160902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32811792-c3", "ovs_interfaceid": "32811792-c388-4fbf-ae6d-43c1e4b213bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.948 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.949 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:55 compute-0 nova_compute[186479]: 2026-02-17 17:29:55.983 186483 DEBUG oslo_concurrency.lockutils [req-86de09f7-6d8f-4431-90e9-61e99b19ec0e req-3fa34cdc-7ec1-47ae-82ec-48e97d562263 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.013 186483 DEBUG nova.compute.provider_tree [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.030 186483 DEBUG nova.scheduler.client.report [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.056 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.083 186483 INFO nova.scheduler.client.report [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.224 186483 DEBUG oslo_concurrency.lockutils [None req-eb9c5ed2-2953-4611-a741-a89d7566332c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.564 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-vif-unplugged-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.564 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.564 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.565 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.565 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] No waiting events found dispatching network-vif-unplugged-32811792-c388-4fbf-ae6d-43c1e4b213bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.565 186483 WARNING nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received unexpected event network-vif-unplugged-32811792-c388-4fbf-ae6d-43c1e4b213bf for instance with vm_state deleted and task_state None.
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.565 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 DEBUG oslo_concurrency.lockutils [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] No waiting events found dispatching network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 WARNING nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received unexpected event network-vif-plugged-32811792-c388-4fbf-ae6d-43c1e4b213bf for instance with vm_state deleted and task_state None.
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.566 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Received event network-vif-deleted-32811792-c388-4fbf-ae6d-43c1e4b213bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.567 186483 INFO nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Neutron deleted interface 32811792-c388-4fbf-ae6d-43c1e4b213bf; detaching it from the instance and deleting it from the info cache
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.567 186483 DEBUG nova.network.neutron [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb 17 17:29:56 compute-0 nova_compute[186479]: 2026-02-17 17:29:56.569 186483 DEBUG nova.compute.manager [req-6c8610f6-19e2-4ce8-8320-cb145ae8386a req-3b2512ea-92e3-43cd-8a00-528a4e2ff703 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Detach interface failed, port_id=32811792-c388-4fbf-ae6d-43c1e4b213bf, reason: Instance 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Feb 17 17:29:57 compute-0 nova_compute[186479]: 2026-02-17 17:29:57.027 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:57 compute-0 nova_compute[186479]: 2026-02-17 17:29:57.028 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:57 compute-0 nova_compute[186479]: 2026-02-17 17:29:57.028 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:29:57 compute-0 nova_compute[186479]: 2026-02-17 17:29:57.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:57 compute-0 nova_compute[186479]: 2026-02-17 17:29:57.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.306 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.331 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.332 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.332 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.332 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.436 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.480 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.481 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5774MB free_disk=73.21108627319336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.481 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.482 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.528 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.528 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.569 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.581 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.601 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:29:58 compute-0 nova_compute[186479]: 2026-02-17 17:29:58.602 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:29:59 compute-0 nova_compute[186479]: 2026-02-17 17:29:59.594 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:29:59 compute-0 nova_compute[186479]: 2026-02-17 17:29:59.868 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.318 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.318 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.318 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.331 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:30:00 compute-0 nova_compute[186479]: 2026-02-17 17:30:00.331 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:01 compute-0 nova_compute[186479]: 2026-02-17 17:30:01.701 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:01 compute-0 nova_compute[186479]: 2026-02-17 17:30:01.719 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:02 compute-0 podman[215933]: 2026-02-17 17:30:02.801199826 +0000 UTC m=+0.134363191 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 17 17:30:03 compute-0 nova_compute[186479]: 2026-02-17 17:30:03.440 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:04 compute-0 nova_compute[186479]: 2026-02-17 17:30:04.871 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:05 compute-0 nova_compute[186479]: 2026-02-17 17:30:05.035 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349390.0325494, ee0a4091-de19-47a0-b8ea-04c5e762e3f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:30:05 compute-0 nova_compute[186479]: 2026-02-17 17:30:05.036 186483 INFO nova.compute.manager [-] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] VM Stopped (Lifecycle Event)
Feb 17 17:30:05 compute-0 nova_compute[186479]: 2026-02-17 17:30:05.054 186483 DEBUG nova.compute.manager [None req-5243327b-4404-4623-8de8-8f6613445bc0 - - - - - -] [instance: ee0a4091-de19-47a0-b8ea-04c5e762e3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:05 compute-0 podman[215960]: 2026-02-17 17:30:05.724889046 +0000 UTC m=+0.064110092 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:30:08 compute-0 nova_compute[186479]: 2026-02-17 17:30:08.442 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:09 compute-0 nova_compute[186479]: 2026-02-17 17:30:09.843 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349394.8430245, 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:30:09 compute-0 nova_compute[186479]: 2026-02-17 17:30:09.844 186483 INFO nova.compute.manager [-] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] VM Stopped (Lifecycle Event)
Feb 17 17:30:09 compute-0 nova_compute[186479]: 2026-02-17 17:30:09.863 186483 DEBUG nova.compute.manager [None req-83d3f4e6-e959-4c8d-99c9-6314f499cd4b - - - - - -] [instance: 8d09d4bb-e7e8-4a39-b2b3-c99271d7d5aa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:09 compute-0 nova_compute[186479]: 2026-02-17 17:30:09.874 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:10.947 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:10.948 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:10.948 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:11 compute-0 podman[215987]: 2026-02-17 17:30:11.732566662 +0000 UTC m=+0.074480592 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1770267347, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Feb 17 17:30:13 compute-0 nova_compute[186479]: 2026-02-17 17:30:13.444 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:14 compute-0 nova_compute[186479]: 2026-02-17 17:30:14.876 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:17 compute-0 sshd-session[216010]: Connection closed by authenticating user root 35.200.201.144 port 47164 [preauth]
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.600 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.600 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.618 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.685 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.686 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.694 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.694 186483 INFO nova.compute.claims [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.801 186483 DEBUG nova.compute.provider_tree [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.815 186483 DEBUG nova.scheduler.client.report [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.833 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.834 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.972 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.973 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:30:17 compute-0 nova_compute[186479]: 2026-02-17 17:30:17.995 186483 INFO nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.023 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.113 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.114 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.114 186483 INFO nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Creating image(s)
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.115 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.115 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.116 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.128 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.184 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.185 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.186 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.201 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.247 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.251 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.280 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.282 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.282 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.306 186483 DEBUG nova.policy [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.331 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.331 186483 DEBUG nova.virt.disk.api [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.332 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.379 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.380 186483 DEBUG nova.virt.disk.api [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.380 186483 DEBUG nova.objects.instance [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.391 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.392 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Ensure instance console log exists: /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.392 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.393 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.393 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:18 compute-0 nova_compute[186479]: 2026-02-17 17:30:18.445 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:19 compute-0 nova_compute[186479]: 2026-02-17 17:30:19.303 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Successfully created port: 69143768-441a-4b58-8b86-b127b5cb10ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:30:19 compute-0 nova_compute[186479]: 2026-02-17 17:30:19.879 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.017 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Successfully updated port: 69143768-441a-4b58-8b86-b127b5cb10ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.030 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.030 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.030 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.168 186483 DEBUG nova.compute.manager [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.168 186483 DEBUG nova.compute.manager [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing instance network info cache due to event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:30:20 compute-0 nova_compute[186479]: 2026-02-17 17:30:20.168 186483 DEBUG oslo_concurrency.lockutils [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:20 compute-0 podman[216027]: 2026-02-17 17:30:20.705743033 +0000 UTC m=+0.044778858 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:30:20 compute-0 podman[216028]: 2026-02-17 17:30:20.724387191 +0000 UTC m=+0.059008670 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.299 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.878 186483 DEBUG nova.network.neutron [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.903 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.904 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance network_info: |[{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.904 186483 DEBUG oslo_concurrency.lockutils [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.905 186483 DEBUG nova.network.neutron [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing network info cache for port 69143768-441a-4b58-8b86-b127b5cb10ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.908 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Start _get_guest_xml network_info=[{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.913 186483 WARNING nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.918 186483 DEBUG nova.virt.libvirt.host [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.919 186483 DEBUG nova.virt.libvirt.host [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.928 186483 DEBUG nova.virt.libvirt.host [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.929 186483 DEBUG nova.virt.libvirt.host [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.930 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.930 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.931 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.932 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.932 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.933 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.933 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.933 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.934 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.934 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.935 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.935 186483 DEBUG nova.virt.hardware [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.940 186483 DEBUG nova.virt.libvirt.vif [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:30:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.941 186483 DEBUG nova.network.os_vif_util [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.942 186483 DEBUG nova.network.os_vif_util [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.943 186483 DEBUG nova.objects.instance [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.959 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <uuid>3ad24cf9-8612-4439-b021-bda5f2bddb24</uuid>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <name>instance-00000003</name>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:30:21</nova:creationTime>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:21 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <system>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="serial">3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="uuid">3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </system>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <os>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </os>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <features>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </features>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:c1:bc:72"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <target dev="tap69143768-44"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log" append="off"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <video>
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </video>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:30:21 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:30:21 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:30:21 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:30:21 compute-0 nova_compute[186479]: </domain>
Feb 17 17:30:21 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.960 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Preparing to wait for external event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.960 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.961 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.961 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.962 186483 DEBUG nova.virt.libvirt.vif [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:30:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.962 186483 DEBUG nova.network.os_vif_util [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.963 186483 DEBUG nova.network.os_vif_util [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.964 186483 DEBUG os_vif [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.965 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.965 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.966 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.970 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.970 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69143768-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.971 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69143768-44, col_values=(('external_ids', {'iface-id': '69143768-441a-4b58-8b86-b127b5cb10ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:bc:72', 'vm-uuid': '3ad24cf9-8612-4439-b021-bda5f2bddb24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:21 compute-0 NetworkManager[56323]: <info>  [1771349421.9742] manager: (tap69143768-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.976 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.980 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:21 compute-0 nova_compute[186479]: 2026-02-17 17:30:21.982 186483 INFO os_vif [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44')
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.021 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.022 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.022 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:c1:bc:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.022 186483 INFO nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Using config drive
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.298 186483 INFO nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Creating config drive at /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.302 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcbviymcq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.424 186483 DEBUG oslo_concurrency.processutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcbviymcq" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:30:22 compute-0 kernel: tap69143768-44: entered promiscuous mode
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.4723] manager: (tap69143768-44): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 17 17:30:22 compute-0 ovn_controller[96568]: 2026-02-17T17:30:22Z|00046|binding|INFO|Claiming lport 69143768-441a-4b58-8b86-b127b5cb10ba for this chassis.
Feb 17 17:30:22 compute-0 ovn_controller[96568]: 2026-02-17T17:30:22Z|00047|binding|INFO|69143768-441a-4b58-8b86-b127b5cb10ba: Claiming fa:16:3e:c1:bc:72 10.100.0.12
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.475 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.479 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.488 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:bc:72 10.100.0.12'], port_security=['fa:16:3e:c1:bc:72 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a74e4f80-b041-475a-a4c7-2220e3eb2e06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24a37ac2-f3c1-40d9-9892-4196ddc52d6c, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=69143768-441a-4b58-8b86-b127b5cb10ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.489 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 69143768-441a-4b58-8b86-b127b5cb10ba in datapath 4b74eb23-d2ba-4cd7-803b-057cc56db5a9 bound to our chassis
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.490 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b74eb23-d2ba-4cd7-803b-057cc56db5a9
Feb 17 17:30:22 compute-0 systemd-machined[155877]: New machine qemu-3-instance-00000003.
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.506 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a9243bde-1d02-4b09-a9d0-560e909eb1cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.508 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b74eb23-d1 in ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:30:22 compute-0 ovn_controller[96568]: 2026-02-17T17:30:22Z|00048|binding|INFO|Setting lport 69143768-441a-4b58-8b86-b127b5cb10ba ovn-installed in OVS
Feb 17 17:30:22 compute-0 ovn_controller[96568]: 2026-02-17T17:30:22Z|00049|binding|INFO|Setting lport 69143768-441a-4b58-8b86-b127b5cb10ba up in Southbound
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.510 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b74eb23-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.510 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[114d6541-de6d-4d49-b1c9-33be045f1e3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.510 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.512 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fc295e5d-c0c4-4f6c-bf5a-4b9cdbd54aca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.526 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[6b73c92c-7908-4c5a-a036-6da305ed55ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Feb 17 17:30:22 compute-0 systemd-udevd[216087]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.541 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f074691d-9afb-453e-9881-fc54912ed35a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.5483] device (tap69143768-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.5488] device (tap69143768-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.566 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[93d24b16-ddb8-4b08-bfe4-2e566e03b334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.570 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[eae0436f-a8cc-4b41-b12a-c0fa7f854ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.5719] manager: (tap4b74eb23-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.600 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[3afa32cc-a3d9-4d9e-8594-f2c8c7c6bb4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.603 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9c92db-92d1-4a0a-adbd-200ee55f1beb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.6206] device (tap4b74eb23-d0): carrier: link connected
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.624 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[bdef79cd-2f7d-4f8f-bcf2-e69dc927592c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.637 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ec533520-4bd1-4d28-a7a5-84c64d8b4f1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b74eb23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:e0:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317793, 'reachable_time': 18378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216117, 'error': None, 'target': 'ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.651 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4645106e-ea91-443b-b125-a061eeb4aef7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:e016'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 317793, 'tstamp': 317793}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216119, 'error': None, 'target': 'ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.665 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c1615b44-fb69-4291-8d1a-66c2e299fa11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b74eb23-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:e0:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317793, 'reachable_time': 18378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216120, 'error': None, 'target': 'ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.690 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ab133414-81e6-44a9-9a1d-3a69293d6c85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.732 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[298459ac-f0cf-4286-811a-3ae56173fa01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.734 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b74eb23-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.735 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.735 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b74eb23-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:22 compute-0 kernel: tap4b74eb23-d0: entered promiscuous mode
Feb 17 17:30:22 compute-0 NetworkManager[56323]: <info>  [1771349422.7378] manager: (tap4b74eb23-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.737 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.743 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b74eb23-d0, col_values=(('external_ids', {'iface-id': 'f26c418c-dcca-4701-be98-da0e994433df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.744 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 ovn_controller[96568]: 2026-02-17T17:30:22Z|00050|binding|INFO|Releasing lport f26c418c-dcca-4701-be98-da0e994433df from this chassis (sb_readonly=0)
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.748 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b74eb23-d2ba-4cd7-803b-057cc56db5a9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b74eb23-d2ba-4cd7-803b-057cc56db5a9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.748 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.749 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c1426574-6337-4968-b0a2-e49308dcb8b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.749 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-4b74eb23-d2ba-4cd7-803b-057cc56db5a9
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/4b74eb23-d2ba-4cd7-803b-057cc56db5a9.pid.haproxy
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 4b74eb23-d2ba-4cd7-803b-057cc56db5a9
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:30:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:22.750 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'env', 'PROCESS_TAG=haproxy-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b74eb23-d2ba-4cd7-803b-057cc56db5a9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.775 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349422.7742052, 3ad24cf9-8612-4439-b021-bda5f2bddb24 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.775 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] VM Started (Lifecycle Event)
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.798 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.804 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349422.774548, 3ad24cf9-8612-4439-b021-bda5f2bddb24 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.804 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] VM Paused (Lifecycle Event)
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.820 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.823 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:30:22 compute-0 nova_compute[186479]: 2026-02-17 17:30:22.848 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:30:23 compute-0 podman[216160]: 2026-02-17 17:30:23.04263588 +0000 UTC m=+0.051085938 container create b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 17 17:30:23 compute-0 systemd[1]: Started libpod-conmon-b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e.scope.
Feb 17 17:30:23 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:30:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea28a5ff56d146f119a735a940acf054a14121a7c2a926cc49d5d76024385e5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:30:23 compute-0 podman[216160]: 2026-02-17 17:30:23.015046838 +0000 UTC m=+0.023496916 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:30:23 compute-0 podman[216160]: 2026-02-17 17:30:23.119925739 +0000 UTC m=+0.128375857 container init b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 17 17:30:23 compute-0 podman[216160]: 2026-02-17 17:30:23.124681343 +0000 UTC m=+0.133131431 container start b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 17 17:30:23 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [NOTICE]   (216179) : New worker (216181) forked
Feb 17 17:30:23 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [NOTICE]   (216179) : Loading success.
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.452 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.761 186483 DEBUG nova.compute.manager [req-55a36969-184c-4946-ac14-d179ae657047 req-f6894948-1022-471a-a9e9-cd3dbc0f56d6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.761 186483 DEBUG oslo_concurrency.lockutils [req-55a36969-184c-4946-ac14-d179ae657047 req-f6894948-1022-471a-a9e9-cd3dbc0f56d6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.762 186483 DEBUG oslo_concurrency.lockutils [req-55a36969-184c-4946-ac14-d179ae657047 req-f6894948-1022-471a-a9e9-cd3dbc0f56d6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.762 186483 DEBUG oslo_concurrency.lockutils [req-55a36969-184c-4946-ac14-d179ae657047 req-f6894948-1022-471a-a9e9-cd3dbc0f56d6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.762 186483 DEBUG nova.compute.manager [req-55a36969-184c-4946-ac14-d179ae657047 req-f6894948-1022-471a-a9e9-cd3dbc0f56d6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Processing event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.764 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.769 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349423.769569, 3ad24cf9-8612-4439-b021-bda5f2bddb24 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.770 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] VM Resumed (Lifecycle Event)
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.773 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.777 186483 INFO nova.virt.libvirt.driver [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance spawned successfully.
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.778 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.804 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.812 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.812 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.813 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.813 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.813 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.814 186483 DEBUG nova.virt.libvirt.driver [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.818 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.872 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.887 186483 INFO nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Took 5.77 seconds to spawn the instance on the hypervisor.
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.887 186483 DEBUG nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.951 186483 INFO nova.compute.manager [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Took 6.29 seconds to build instance.
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.964 186483 DEBUG nova.network.neutron [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updated VIF entry in instance network info cache for port 69143768-441a-4b58-8b86-b127b5cb10ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.964 186483 DEBUG nova.network.neutron [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.967 186483 DEBUG oslo_concurrency.lockutils [None req-67ee82de-1a43-4c34-823c-2a968fcd1823 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:23 compute-0 nova_compute[186479]: 2026-02-17 17:30:23.977 186483 DEBUG oslo_concurrency.lockutils [req-d3322a95-97a1-4610-bceb-0ea80692ca62 req-b8916b28-e2b5-46f4-8af3-635415d915e3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:24 compute-0 podman[216190]: 2026-02-17 17:30:24.731424741 +0000 UTC m=+0.063178600 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.839 186483 DEBUG nova.compute.manager [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.840 186483 DEBUG oslo_concurrency.lockutils [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.840 186483 DEBUG oslo_concurrency.lockutils [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.841 186483 DEBUG oslo_concurrency.lockutils [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.841 186483 DEBUG nova.compute.manager [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:25 compute-0 nova_compute[186479]: 2026-02-17 17:30:25.842 186483 WARNING nova.compute.manager [req-3bedfca6-2bf4-4db8-80ac-d9e365a695bb req-5d071c59-80a6-4a40-8438-13722ed2a5cd 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba for instance with vm_state active and task_state None.
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.007 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:27 compute-0 ovn_controller[96568]: 2026-02-17T17:30:27Z|00051|binding|INFO|Releasing lport f26c418c-dcca-4701-be98-da0e994433df from this chassis (sb_readonly=0)
Feb 17 17:30:27 compute-0 NetworkManager[56323]: <info>  [1771349427.3013] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 17 17:30:27 compute-0 NetworkManager[56323]: <info>  [1771349427.3022] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.303 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:27 compute-0 ovn_controller[96568]: 2026-02-17T17:30:27Z|00052|binding|INFO|Releasing lport f26c418c-dcca-4701-be98-da0e994433df from this chassis (sb_readonly=0)
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.314 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.905 186483 DEBUG nova.compute.manager [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.906 186483 DEBUG nova.compute.manager [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing instance network info cache due to event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.906 186483 DEBUG oslo_concurrency.lockutils [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.907 186483 DEBUG oslo_concurrency.lockutils [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:27 compute-0 nova_compute[186479]: 2026-02-17 17:30:27.907 186483 DEBUG nova.network.neutron [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing network info cache for port 69143768-441a-4b58-8b86-b127b5cb10ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:30:28 compute-0 nova_compute[186479]: 2026-02-17 17:30:28.450 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:28 compute-0 nova_compute[186479]: 2026-02-17 17:30:28.978 186483 DEBUG nova.network.neutron [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updated VIF entry in instance network info cache for port 69143768-441a-4b58-8b86-b127b5cb10ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:30:28 compute-0 nova_compute[186479]: 2026-02-17 17:30:28.978 186483 DEBUG nova.network.neutron [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:29 compute-0 nova_compute[186479]: 2026-02-17 17:30:29.002 186483 DEBUG oslo_concurrency.lockutils [req-24ef9b5d-386f-48b0-931c-68b62cfb3336 req-5eeaa890-5443-4a77-ae22-92e6bf60704a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:32 compute-0 nova_compute[186479]: 2026-02-17 17:30:32.011 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:33 compute-0 nova_compute[186479]: 2026-02-17 17:30:33.452 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:33 compute-0 podman[216215]: 2026-02-17 17:30:33.761508986 +0000 UTC m=+0.095378329 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 17 17:30:35 compute-0 ovn_controller[96568]: 2026-02-17T17:30:35Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:bc:72 10.100.0.12
Feb 17 17:30:35 compute-0 ovn_controller[96568]: 2026-02-17T17:30:35Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:bc:72 10.100.0.12
Feb 17 17:30:36 compute-0 podman[216259]: 2026-02-17 17:30:36.747833548 +0000 UTC m=+0.083936566 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:30:37 compute-0 nova_compute[186479]: 2026-02-17 17:30:37.013 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:38 compute-0 nova_compute[186479]: 2026-02-17 17:30:38.455 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:42 compute-0 nova_compute[186479]: 2026-02-17 17:30:42.006 186483 INFO nova.compute.manager [None req-6e8d223b-b90d-4885-bbb1-a073c822e3be 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Get console output
Feb 17 17:30:42 compute-0 nova_compute[186479]: 2026-02-17 17:30:42.011 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:30:42 compute-0 nova_compute[186479]: 2026-02-17 17:30:42.017 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:42 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:42.453 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:30:42 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:42.454 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:30:42 compute-0 nova_compute[186479]: 2026-02-17 17:30:42.454 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:42 compute-0 podman[216283]: 2026-02-17 17:30:42.729487034 +0000 UTC m=+0.069661023 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1770267347, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Feb 17 17:30:43 compute-0 nova_compute[186479]: 2026-02-17 17:30:43.456 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.096 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}176b5ee517a1d69deea6da4ad2e89beee540ef4f38542800634bcf4cf13639b1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.212 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 17 Feb 2026 17:30:44 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-dad6a337-b7e3-40e5-970c-f028b1c4c36e x-openstack-request-id: req-dad6a337-b7e3-40e5-970c-f028b1c4c36e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.212 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "5ebd41ec-8360-4181-bce3-0c0dc586cdb2", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2"}]}, {"id": "624094d4-f25f-447e-a21a-3f4396234ebb", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/624094d4-f25f-447e-a21a-3f4396234ebb"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/624094d4-f25f-447e-a21a-3f4396234ebb"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.213 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-dad6a337-b7e3-40e5-970c-f028b1c4c36e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.215 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}176b5ee517a1d69deea6da4ad2e89beee540ef4f38542800634bcf4cf13639b1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.287 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Tue, 17 Feb 2026 17:30:44 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-bf847b4d-9bf0-473d-bba4-02a2ffa055c4 x-openstack-request-id: req-bf847b4d-9bf0-473d-bba4-02a2ffa055c4 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.287 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "5ebd41ec-8360-4181-bce3-0c0dc586cdb2", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.288 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/5ebd41ec-8360-4181-bce3-0c0dc586cdb2 used request id req-bf847b4d-9bf0-473d-bba4-02a2ffa055c4 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.289 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'name': 'tempest-TestNetworkBasicOps-server-2040077989', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'user_id': '3f041abe92134380b8de39091bce5989', 'hostId': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.290 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.323 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.bytes volume: 72937472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.324 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7825f8da-4893-413d-a8b0-7664d472ebbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72937472, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.290485', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63867724-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'da29dc2a05105032a9a3f3773a49865b2590c2d5cc070fc25863a4aaf8cb1f50'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.290485', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638691d2-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'd96cfcb65bfb797a2a9d599d0b73cbcdb8d4912d8c5b652641f2aa2e55e736e3'}]}, 'timestamp': '2026-02-17 17:30:44.325336', '_unique_id': '9298c1c798fd4478b2c205ea830c4d07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.332 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.337 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.latency volume: 498838641 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.338 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.latency volume: 41730147 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c986afe9-ff67-484d-95d0-645b7d45e0bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 498838641, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.337248', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63887e8e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'd5ec56a7848a09f6c8d7077f797e6802c20e55f318a154d4df02d0105c49d305'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41730147, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.337248', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '63889a9a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '29675e59933f2f873e627d0cbc6f61b7e21718e04a78ac9bf0b2748a6e985279'}]}, 'timestamp': '2026-02-17 17:30:44.338643', '_unique_id': '753fe643b2c5467f8f06278353837487'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.339 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.343 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3ad24cf9-8612-4439-b021-bda5f2bddb24 / tap69143768-44 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.343 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76eb4530-b060-4d9e-9f29-32350eae346c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.340359', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638977bc-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'b6373136a5e38f9ae3d7097887eb8fe45471619b641b8e97b75c9614e68eb6f2'}]}, 'timestamp': '2026-02-17 17:30:44.344223', '_unique_id': 'b03221c424bf4f309ba00d018954787d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.345 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.outgoing.bytes volume: 3348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '051214d0-b951-4c98-b161-901558b50437', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3348, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.345894', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '6389c726-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'bf929299faf19105b75d90944e4ef56466feb49fabd9adccf312c7d275ed615e'}]}, 'timestamp': '2026-02-17 17:30:44.346212', '_unique_id': '25d38bc9c5b1497ab31bf080fa33ca81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.346 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.347 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.347 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e92be41-1236-4fe4-8d62-27c83adcc6d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.347575', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '638a08a8-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '57c6c4a0656878e7d282cd003ca354d317b0a9bb7d3a5181c60d11945bfa873b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.347575', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638a133e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'fbb827d3be9fe0e2a390014c3fc3e5667a698d977727b17ef575960cddce0c87'}]}, 'timestamp': '2026-02-17 17:30:44.348146', '_unique_id': '1b1eda7288dd49a09057996022a5c16c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.348 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.349 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd1a2c34-17c1-470c-bf5f-d210c0659739', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.349515', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638a54b6-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'a10e87cf2415c4cc70eb94ce798c9a35b86445d81f3d4b76d57a3e7edeca92b9'}]}, 'timestamp': '2026-02-17 17:30:44.349821', '_unique_id': '4ea7a5ea80f64717a00f007dc93791e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.350 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.362 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.362 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7a5e5e2-f252-4b1c-9b4d-e0eefe7a22a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.351170', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '638c5176-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': 'ca2c234e7864b6735e4b21e38238c0a76b0b7f48fb9f11190881e79a428bdf39'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.351170', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638c651c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': 'a067c2581f4308142ab9a8ff2e351e350253ef827295b39a1fcb929bb26fd318'}]}, 'timestamp': '2026-02-17 17:30:44.363398', '_unique_id': 'c35e5ab1c0b8410fb9d827f24d387aa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.364 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.365 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.365 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36910d5c-7b18-4c5d-a70e-f2fc61a1110a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.365638', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638ccdfe-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'bcfc52347f996ba4c0ee5e3a6172c395fb20f0546e0bead85b48c045b64a08a4'}]}, 'timestamp': '2026-02-17 17:30:44.366081', '_unique_id': '773f06d1ff3e44f8bdaac5f5283a03da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.366 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.367 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.367 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>]
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.368 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.368 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.latency volume: 2126764502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.369 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc9f7380-756a-41b3-8080-6e5e0dcc539f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2126764502, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.368460', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '638d436a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'dab859ad4c7cafeb36b818e3f1295d672d16997240c80a6a52543d2157292dc4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.368460', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638d5184-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '58c1a9728989d5245e7e220846c8053898af51932aa8c413b2ec0052f7e813fc'}]}, 'timestamp': '2026-02-17 17:30:44.369417', '_unique_id': '8d8bb9e605af44f5844f82ad415c096f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.370 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.371 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.incoming.bytes volume: 4585 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1ec6c0e-c106-454a-9283-3815221675f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4585, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.371166', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638da67a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': '9ca8bc1159483088215f183b5c93042c425f33f068de5289264607beaea491e2'}]}, 'timestamp': '2026-02-17 17:30:44.371581', '_unique_id': '74b9e68a906849008b9fcae673ca6ddd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.372 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.373 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.373 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54c99403-9a4f-4184-9e49-1e4b5ae39ccc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.373098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '638df09e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': '933d4e9a8c11f24e5ea9697e880a6dbd7144304554573cda0bf3c000a7cd7bc4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.373098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638dfd64-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': '11229da2a0e72806b13a23948b17ddfabc023ca8d9346b7916538e417dd749e1'}]}, 'timestamp': '2026-02-17 17:30:44.373790', '_unique_id': '2e60572ec09941b2ab2499d9979b9608'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.374 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.375 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.376 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a116a8fe-c80d-4140-8c8a-5715717c1915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.375680', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '638e558e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': '9f96c1fef4aa40b4ad51a6d6ceb91c6bb4d07197acc841ee0b5cbfdc76384aba'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.375680', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '638e631c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.716296584, 'message_signature': '36d329446f4961e1b48e7741135d6d3c42c4812afb813a5f39ca01afea81acf6'}]}, 'timestamp': '2026-02-17 17:30:44.376394', '_unique_id': 'f70cca41a2c44200a58bc2c0c962545c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.378 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fefc3e8-a73c-47d0-84d1-950a8797749f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.377996', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638eb286-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'a574f8e90073638ef0b87f6dc973cc81b5c4409f6fb8c7d58c1d0d3f68cfb4bc'}]}, 'timestamp': '2026-02-17 17:30:44.378441', '_unique_id': 'f71803b9b3de4b288994a765878e843b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.379 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '948e019d-e726-4d18-89c8-c253ad660c7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.379953', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '638efd4a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': 'eb5444eedc314a61902b7485f6624957d5bae6eb1376207961168ac2dacef76d'}]}, 'timestamp': '2026-02-17 17:30:44.380379', '_unique_id': '15fa71b477df4eb4b41c7fbf2d1c96ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.380 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.381 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.400 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/memory.usage volume: 42.4765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7bd3d4dd-3ff3-4f05-a006-527419251ec2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.4765625, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'timestamp': '2026-02-17T17:30:44.381855', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '63922b00-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.765111866, 'message_signature': '721d06bdd288f061e8b44be27db29e207afaa7f6c6e3415b6d7e111030872d94'}]}, 'timestamp': '2026-02-17 17:30:44.401362', '_unique_id': '47b0571964b0477e9496d931f287d09f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.402 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.403 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.404 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.404 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>]
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.404 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.404 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.requests volume: 311 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.405 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c71c731f-e550-4e19-bbb0-289ab28f1197', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 311, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.404765', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6392c678-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '3206666056b2612628947e7a8812e906b6ec7c41b94ee9d79271e9d0ea9e00f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.404765', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6392d85c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '291e7fed22b51d2ec1d451d4442640ef18e27968e8fd1130d942078eb2e9d7dc'}]}, 'timestamp': '2026-02-17 17:30:44.405672', '_unique_id': 'c384a6f356a848e6a1efc6d07c0c07f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.406 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.407 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.bytes volume: 29772288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.408 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8966e8ea-7fbc-4b0e-bb67-e10a4dc5caae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29772288, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-vda', 'timestamp': '2026-02-17T17:30:44.407934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63934274-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': '3d63ab5223bc5b21eb39a8f20c70b7f610662a3d894a9baf6d78102c2d52dd77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24-sda', 'timestamp': '2026-02-17T17:30:44.407934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6393528c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.65569595, 'message_signature': 'a5af51f5c92221873be5e35424ddb4678dff84d3cf209a61f6fe8e5de5232940'}]}, 'timestamp': '2026-02-17 17:30:44.408806', '_unique_id': '81ab32b668934999b387e4ef24908c35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.409 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.410 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.411 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>]
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.411 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.411 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/cpu volume: 10760000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b146f0bb-bee2-4317-95dc-8433f122d89b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10760000000, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'timestamp': '2026-02-17T17:30:44.411540', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'instance-00000003', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6393ce6a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.765111866, 'message_signature': '340a1c43b43f9a453b1277a6234eca3bc1ef441d9ee9df968588e7dc89dc8577'}]}, 'timestamp': '2026-02-17 17:30:44.411982', '_unique_id': '440c71dea5da4196b63f7f5654726769'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.412 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.413 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.414 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '741c0b7b-6d41-4a88-a710-44a1fce6b3e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.414075', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '639431f2-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': '0d3aefda51589da67ff4901cffcf596e590cdea91854bca37022dcc2a82e4fc1'}]}, 'timestamp': '2026-02-17 17:30:44.414545', '_unique_id': '8bd62c75fb394a31841317a4a6f28c3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.415 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.416 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.416 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.outgoing.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f1ce914-b162-408a-8881-873b71eceb1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.416647', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '639495d4-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': '01cd33a4e183f93de6b28053b5cbb13a671a7ca56e3d4a039d83faa84176ae38'}]}, 'timestamp': '2026-02-17 17:30:44.417164', '_unique_id': 'c36d02e287e1420da2c0d5f7e6545a4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.418 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.419 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.419 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2040077989>]
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.420 12 DEBUG ceilometer.compute.pollsters [-] 3ad24cf9-8612-4439-b021-bda5f2bddb24/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '091edc3e-c94e-435c-89f8-88bf46f606cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000003-3ad24cf9-8612-4439-b021-bda5f2bddb24-tap69143768-44', 'timestamp': '2026-02-17T17:30:44.420185', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2040077989', 'name': 'tap69143768-44', 'instance_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:bc:72', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69143768-44'}, 'message_id': '63952062-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3199.705479495, 'message_signature': '29743b08e7410ff337ed3c98363558a1e6db8855a3627347a3965624857c782c'}]}, 'timestamp': '2026-02-17 17:30:44.420647', '_unique_id': '3fa3c19803b14926b46f63cf78556800'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:30:44 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:30:44.421 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:30:44 compute-0 nova_compute[186479]: 2026-02-17 17:30:44.872 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:44 compute-0 nova_compute[186479]: 2026-02-17 17:30:44.873 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:44 compute-0 nova_compute[186479]: 2026-02-17 17:30:44.873 186483 DEBUG nova.objects.instance [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'flavor' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:45 compute-0 nova_compute[186479]: 2026-02-17 17:30:45.179 186483 DEBUG nova.objects.instance [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_requests' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:45 compute-0 nova_compute[186479]: 2026-02-17 17:30:45.192 186483 DEBUG nova.network.neutron [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:30:46 compute-0 nova_compute[186479]: 2026-02-17 17:30:46.320 186483 DEBUG nova.policy [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:30:47 compute-0 nova_compute[186479]: 2026-02-17 17:30:47.017 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:48 compute-0 nova_compute[186479]: 2026-02-17 17:30:48.395 186483 DEBUG nova.network.neutron [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Successfully created port: 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:30:48 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:48.457 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:48 compute-0 nova_compute[186479]: 2026-02-17 17:30:48.461 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.534 186483 DEBUG nova.network.neutron [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Successfully updated port: 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.551 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.552 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.552 186483 DEBUG nova.network.neutron [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.626 186483 DEBUG nova.compute.manager [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-changed-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.626 186483 DEBUG nova.compute.manager [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing instance network info cache due to event network-changed-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:30:49 compute-0 nova_compute[186479]: 2026-02-17 17:30:49.627 186483 DEBUG oslo_concurrency.lockutils [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:51 compute-0 podman[216304]: 2026-02-17 17:30:51.707268348 +0000 UTC m=+0.050122734 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 17 17:30:51 compute-0 podman[216305]: 2026-02-17 17:30:51.738779944 +0000 UTC m=+0.072166903 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.821 186483 DEBUG nova.network.neutron [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.839 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.840 186483 DEBUG oslo_concurrency.lockutils [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.840 186483 DEBUG nova.network.neutron [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing network info cache for port 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.843 186483 DEBUG nova.virt.libvirt.vif [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.844 186483 DEBUG nova.network.os_vif_util [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.844 186483 DEBUG nova.network.os_vif_util [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.845 186483 DEBUG os_vif [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.845 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.845 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.846 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.848 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.849 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c3d59e1-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.849 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c3d59e1-70, col_values=(('external_ids', {'iface-id': '3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:b0:3d', 'vm-uuid': '3ad24cf9-8612-4439-b021-bda5f2bddb24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:51 compute-0 NetworkManager[56323]: <info>  [1771349451.8518] manager: (tap3c3d59e1-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.851 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.859 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.862 186483 INFO os_vif [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70')
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.863 186483 DEBUG nova.virt.libvirt.vif [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.863 186483 DEBUG nova.network.os_vif_util [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.864 186483 DEBUG nova.network.os_vif_util [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.867 186483 DEBUG nova.virt.libvirt.guest [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] attach device xml: <interface type="ethernet">
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:55:b0:3d"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <target dev="tap3c3d59e1-70"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]: </interface>
Feb 17 17:30:51 compute-0 nova_compute[186479]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Feb 17 17:30:51 compute-0 NetworkManager[56323]: <info>  [1771349451.8799] manager: (tap3c3d59e1-70): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 17 17:30:51 compute-0 kernel: tap3c3d59e1-70: entered promiscuous mode
Feb 17 17:30:51 compute-0 ovn_controller[96568]: 2026-02-17T17:30:51Z|00053|binding|INFO|Claiming lport 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 for this chassis.
Feb 17 17:30:51 compute-0 ovn_controller[96568]: 2026-02-17T17:30:51Z|00054|binding|INFO|3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6: Claiming fa:16:3e:55:b0:3d 10.100.0.18
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.883 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.886 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.891 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:b0:3d 10.100.0.18'], port_security=['fa:16:3e:55:b0:3d 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da59558a-2562-48b1-800e-9e22eeba4e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=957c8ef5-b1c2-4c22-b293-a07ba2afa11e, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.892 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 in datapath da59558a-2562-48b1-800e-9e22eeba4e27 bound to our chassis
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.894 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da59558a-2562-48b1-800e-9e22eeba4e27
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.898 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 ovn_controller[96568]: 2026-02-17T17:30:51Z|00055|binding|INFO|Setting lport 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 ovn-installed in OVS
Feb 17 17:30:51 compute-0 ovn_controller[96568]: 2026-02-17T17:30:51Z|00056|binding|INFO|Setting lport 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 up in Southbound
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.901 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.904 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[952e7f6c-39ec-45f0-85cd-615f49703519]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.907 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda59558a-21 in ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.909 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda59558a-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.909 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fb21e11a-5b04-48a5-8700-91b54276fa01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.910 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[20843f30-b16c-45b7-959c-c43be546a505]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.921 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[31a6fbeb-7178-403f-8611-de9d6e220617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 systemd-udevd[216351]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:30:51 compute-0 NetworkManager[56323]: <info>  [1771349451.9364] device (tap3c3d59e1-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:30:51 compute-0 NetworkManager[56323]: <info>  [1771349451.9369] device (tap3c3d59e1-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.944 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3968e897-1747-4487-99b9-5547b126f408]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.964 186483 DEBUG nova.virt.libvirt.driver [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.965 186483 DEBUG nova.virt.libvirt.driver [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.965 186483 DEBUG nova.virt.libvirt.driver [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:c1:bc:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.965 186483 DEBUG nova.virt.libvirt.driver [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:55:b0:3d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.970 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4f6570-caa2-4aea-a997-0fd926490bf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 NetworkManager[56323]: <info>  [1771349451.9761] manager: (tapda59558a-20): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb 17 17:30:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:51.975 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0b789545-25ca-4d9b-a99b-15d3ee96490d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:51 compute-0 nova_compute[186479]: 2026-02-17 17:30:51.989 186483 DEBUG nova.virt.libvirt.guest [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:51</nova:creationTime>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:51 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     <nova:port uuid="3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6">
Feb 17 17:30:51 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 17 17:30:51 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:51 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:51 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:51 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.003 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[6486cca1-578f-4ac8-a6f2-d7eec0aa97f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.007 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[a0539ce7-37a7-44f4-a9a8-bee5827bed74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.020 186483 DEBUG oslo_concurrency.lockutils [None req-b0152f44-69c1-455b-9f7f-d3136f68c61c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:52 compute-0 NetworkManager[56323]: <info>  [1771349452.0262] device (tapda59558a-20): carrier: link connected
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.029 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5108de-1ad5-4db8-8c46-66bf10a38db0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.046 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4709ba-cac6-4402-b0a4-f6aa6275c301]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda59558a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:31:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320733, 'reachable_time': 18763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216377, 'error': None, 'target': 'ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.062 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4af9f1e4-cdc2-459c-b362-dbce468e8f0c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:31f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 320733, 'tstamp': 320733}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216378, 'error': None, 'target': 'ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.079 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[d67ce54c-c4e3-43d2-b918-b669aec4cfa6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda59558a-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:31:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320733, 'reachable_time': 18763, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216379, 'error': None, 'target': 'ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.105 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4843c21c-c7fe-45a2-8b8a-ccf541ab8c8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.159 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0a13436f-24ed-4e99-bade-b9effd775e5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.160 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda59558a-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.160 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.160 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda59558a-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:52 compute-0 kernel: tapda59558a-20: entered promiscuous mode
Feb 17 17:30:52 compute-0 NetworkManager[56323]: <info>  [1771349452.1627] manager: (tapda59558a-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.162 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.164 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda59558a-20, col_values=(('external_ids', {'iface-id': '75aa4176-233e-4ad4-aec2-d158fb26dfb3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:52 compute-0 ovn_controller[96568]: 2026-02-17T17:30:52Z|00057|binding|INFO|Releasing lport 75aa4176-233e-4ad4-aec2-d158fb26dfb3 from this chassis (sb_readonly=0)
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.166 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.167 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da59558a-2562-48b1-800e-9e22eeba4e27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da59558a-2562-48b1-800e-9e22eeba4e27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.167 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f134628c-a41c-41dc-81bd-875057deccf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.168 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-da59558a-2562-48b1-800e-9e22eeba4e27
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/da59558a-2562-48b1-800e-9e22eeba4e27.pid.haproxy
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID da59558a-2562-48b1-800e-9e22eeba4e27
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:30:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:52.168 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27', 'env', 'PROCESS_TAG=haproxy-da59558a-2562-48b1-800e-9e22eeba4e27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da59558a-2562-48b1-800e-9e22eeba4e27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.173 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.469 186483 DEBUG nova.compute.manager [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.469 186483 DEBUG oslo_concurrency.lockutils [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.470 186483 DEBUG oslo_concurrency.lockutils [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.470 186483 DEBUG oslo_concurrency.lockutils [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.470 186483 DEBUG nova.compute.manager [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:52 compute-0 nova_compute[186479]: 2026-02-17 17:30:52.470 186483 WARNING nova.compute.manager [req-01866b9e-3816-4127-84e5-c725a5cf0f40 req-3eec37c1-c6a3-43d8-afca-3250ccd88f8d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 for instance with vm_state active and task_state None.
Feb 17 17:30:52 compute-0 podman[216411]: 2026-02-17 17:30:52.491994062 +0000 UTC m=+0.053776923 container create 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:30:52 compute-0 systemd[1]: Started libpod-conmon-9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d.scope.
Feb 17 17:30:52 compute-0 podman[216411]: 2026-02-17 17:30:52.462178345 +0000 UTC m=+0.023961216 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:30:52 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:30:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94d99dbe44bf9155a7bcea3abf726ff622a233519b1e0d4865cf1778d9ff1cfa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:30:52 compute-0 podman[216411]: 2026-02-17 17:30:52.584559853 +0000 UTC m=+0.146342694 container init 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 17 17:30:52 compute-0 podman[216411]: 2026-02-17 17:30:52.592095413 +0000 UTC m=+0.153878234 container start 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:30:52 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [NOTICE]   (216430) : New worker (216432) forked
Feb 17 17:30:52 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [NOTICE]   (216430) : Loading success.
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.447 186483 DEBUG nova.network.neutron [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updated VIF entry in instance network info cache for port 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.448 186483 DEBUG nova.network.neutron [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.462 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.465 186483 DEBUG oslo_concurrency.lockutils [req-2d618d2d-ac71-4a17-83c0-e5834600e097 req-5a7e2eb6-dab3-4735-93cd-4de28fa9b2da 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.843 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.844 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.862 186483 DEBUG nova.objects.instance [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'flavor' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.889 186483 DEBUG nova.virt.libvirt.vif [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.889 186483 DEBUG nova.network.os_vif_util [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.890 186483 DEBUG nova.network.os_vif_util [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.895 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.898 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.901 186483 DEBUG nova.virt.libvirt.driver [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Attempting to detach device tap3c3d59e1-70 from instance 3ad24cf9-8612-4439-b021-bda5f2bddb24 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.902 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] detach device xml: <interface type="ethernet">
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:55:b0:3d"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <target dev="tap3c3d59e1-70"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </interface>
Feb 17 17:30:53 compute-0 nova_compute[186479]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.909 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.915 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <name>instance-00000003</name>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <uuid>3ad24cf9-8612-4439-b021-bda5f2bddb24</uuid>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:51</nova:creationTime>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:port uuid="3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6">
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <system>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='serial'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='uuid'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </system>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <os>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </os>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <features>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </features>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk' index='2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config' index='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:c1:bc:72'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='tap69143768-44'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:55:b0:3d'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='tap3c3d59e1-70'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='net1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       </target>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </console>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <video>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </video>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c206,c989</label>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c206,c989</imagelabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </domain>
Feb 17 17:30:53 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.917 186483 INFO nova.virt.libvirt.driver [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully detached device tap3c3d59e1-70 from instance 3ad24cf9-8612-4439-b021-bda5f2bddb24 from the persistent domain config.
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.918 186483 DEBUG nova.virt.libvirt.driver [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] (1/8): Attempting to detach device tap3c3d59e1-70 with device alias net1 from instance 3ad24cf9-8612-4439-b021-bda5f2bddb24 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.919 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] detach device xml: <interface type="ethernet">
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:55:b0:3d"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <target dev="tap3c3d59e1-70"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </interface>
Feb 17 17:30:53 compute-0 nova_compute[186479]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 17 17:30:53 compute-0 kernel: tap3c3d59e1-70 (unregistering): left promiscuous mode
Feb 17 17:30:53 compute-0 NetworkManager[56323]: <info>  [1771349453.9631] device (tap3c3d59e1-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:30:53 compute-0 ovn_controller[96568]: 2026-02-17T17:30:53Z|00058|binding|INFO|Releasing lport 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 from this chassis (sb_readonly=0)
Feb 17 17:30:53 compute-0 ovn_controller[96568]: 2026-02-17T17:30:53Z|00059|binding|INFO|Setting lport 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 down in Southbound
Feb 17 17:30:53 compute-0 ovn_controller[96568]: 2026-02-17T17:30:53Z|00060|binding|INFO|Removing iface tap3c3d59e1-70 ovn-installed in OVS
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.970 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.975 186483 DEBUG nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Received event <DeviceRemovedEvent: 1771349453.9751287, 3ad24cf9-8612-4439-b021-bda5f2bddb24 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.976 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.977 186483 DEBUG nova.virt.libvirt.driver [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Start waiting for the detach event from libvirt for device tap3c3d59e1-70 with device alias net1 for instance 3ad24cf9-8612-4439-b021-bda5f2bddb24 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.982 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:53.986 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:b0:3d 10.100.0.18'], port_security=['fa:16:3e:55:b0:3d 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da59558a-2562-48b1-800e-9e22eeba4e27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=957c8ef5-b1c2-4c22-b293-a07ba2afa11e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.987 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <name>instance-00000003</name>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <uuid>3ad24cf9-8612-4439-b021-bda5f2bddb24</uuid>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:51</nova:creationTime>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <nova:port uuid="3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6">
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <system>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='serial'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='uuid'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </system>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <os>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </os>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <features>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </features>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk' index='2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config' index='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:c1:bc:72'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target dev='tap69143768-44'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       </target>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </console>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <video>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </video>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c206,c989</label>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c206,c989</imagelabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:30:53 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:53 compute-0 nova_compute[186479]: </domain>
Feb 17 17:30:53 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.987 186483 INFO nova.virt.libvirt.driver [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully detached device tap3c3d59e1-70 from instance 3ad24cf9-8612-4439-b021-bda5f2bddb24 from the live domain config.
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.988 186483 DEBUG nova.virt.libvirt.vif [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.988 186483 DEBUG nova.network.os_vif_util [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:53.988 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 in datapath da59558a-2562-48b1-800e-9e22eeba4e27 unbound from our chassis
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.989 186483 DEBUG nova.network.os_vif_util [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.989 186483 DEBUG os_vif [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:30:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:53.990 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da59558a-2562-48b1-800e-9e22eeba4e27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.992 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:53.992 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a3367f85-d1d3-4fa3-9d02-40143a301057]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.992 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c3d59e1-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:53.992 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27 namespace which is not needed anymore
Feb 17 17:30:53 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.996 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:53.999 186483 INFO os_vif [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70')
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.000 186483 DEBUG nova.virt.libvirt.guest [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:54</nova:creationTime>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:54 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:54 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:54 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:54 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:54 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:30:54 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [NOTICE]   (216430) : haproxy version is 2.8.14-c23fe91
Feb 17 17:30:54 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [NOTICE]   (216430) : path to executable is /usr/sbin/haproxy
Feb 17 17:30:54 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [WARNING]  (216430) : Exiting Master process...
Feb 17 17:30:54 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [ALERT]    (216430) : Current worker (216432) exited with code 143 (Terminated)
Feb 17 17:30:54 compute-0 neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27[216424]: [WARNING]  (216430) : All workers exited. Exiting... (0)
Feb 17 17:30:54 compute-0 systemd[1]: libpod-9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d.scope: Deactivated successfully.
Feb 17 17:30:54 compute-0 podman[216462]: 2026-02-17 17:30:54.12232781 +0000 UTC m=+0.048020043 container died 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d-userdata-shm.mount: Deactivated successfully.
Feb 17 17:30:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-94d99dbe44bf9155a7bcea3abf726ff622a233519b1e0d4865cf1778d9ff1cfa-merged.mount: Deactivated successfully.
Feb 17 17:30:54 compute-0 podman[216462]: 2026-02-17 17:30:54.162675108 +0000 UTC m=+0.088367331 container cleanup 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 17 17:30:54 compute-0 systemd[1]: libpod-conmon-9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d.scope: Deactivated successfully.
Feb 17 17:30:54 compute-0 podman[216493]: 2026-02-17 17:30:54.221047278 +0000 UTC m=+0.041039085 container remove 9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.225 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1b52d0c4-6bf9-4672-ace7-63275971448a]: (4, ('Tue Feb 17 05:30:54 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27 (9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d)\n9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d\nTue Feb 17 05:30:54 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27 (9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d)\n9c3ac3ced3699d07233c349961002ba93b3068fdb2b0f1b3ebb9799fab1edf4d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.227 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ca4b0d-5cff-4365-a310-29f1d491bddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.228 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda59558a-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.231 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:54 compute-0 kernel: tapda59558a-20: left promiscuous mode
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.234 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.239 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f77f985a-16e9-4c7f-8302-397be48951d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.256 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[06a4cf48-df13-4ab6-bfe5-c46e4a2ee167]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.258 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e19873b5-1920-4a9f-9fb2-585c6fdb9cd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.275 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[89da5749-e1c3-4f81-8e46-22f3101a5b63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 320727, 'reachable_time': 19304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216507, 'error': None, 'target': 'ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 systemd[1]: run-netns-ovnmeta\x2dda59558a\x2d2562\x2d48b1\x2d800e\x2d9e22eeba4e27.mount: Deactivated successfully.
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.278 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da59558a-2562-48b1-800e-9e22eeba4e27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:30:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:54.279 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[67caac5d-63c3-4560-8a4b-f7699eac6d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.568 186483 DEBUG nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.568 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.568 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 DEBUG nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 WARNING nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 for instance with vm_state active and task_state None.
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 DEBUG nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-unplugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.569 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.570 186483 DEBUG oslo_concurrency.lockutils [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.570 186483 DEBUG nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-unplugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.570 186483 WARNING nova.compute.manager [req-92ae081f-9d6d-4667-8591-d0190d424ddf req-70c509b3-32b1-4798-9b4f-7a5e5b3d3af7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-unplugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 for instance with vm_state active and task_state None.
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.918 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.918 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.919 186483 DEBUG nova.network.neutron [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.955 186483 DEBUG nova.compute.manager [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-deleted-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.956 186483 INFO nova.compute.manager [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Neutron deleted interface 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6; detaching it from the instance and deleting it from the info cache
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.956 186483 DEBUG nova.network.neutron [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:54 compute-0 nova_compute[186479]: 2026-02-17 17:30:54.993 186483 DEBUG nova.objects.instance [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lazy-loading 'system_metadata' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.031 186483 DEBUG nova.objects.instance [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lazy-loading 'flavor' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.069 186483 DEBUG nova.virt.libvirt.vif [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.070 186483 DEBUG nova.network.os_vif_util [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.070 186483 DEBUG nova.network.os_vif_util [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.073 186483 DEBUG nova.virt.libvirt.guest [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.079 186483 DEBUG nova.virt.libvirt.guest [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <name>instance-00000003</name>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <uuid>3ad24cf9-8612-4439-b021-bda5f2bddb24</uuid>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:54</nova:creationTime>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <system>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='serial'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='uuid'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </system>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <os>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </os>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <features>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </features>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk' index='2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config' index='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:c1:bc:72'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='tap69143768-44'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       </target>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </console>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <video>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </video>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c206,c989</label>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c206,c989</imagelabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]: </domain>
Feb 17 17:30:55 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.079 186483 DEBUG nova.virt.libvirt.guest [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.085 186483 DEBUG nova.virt.libvirt.guest [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:55:b0:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3c3d59e1-70"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <name>instance-00000003</name>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <uuid>3ad24cf9-8612-4439-b021-bda5f2bddb24</uuid>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:54</nova:creationTime>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <system>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='serial'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='uuid'>3ad24cf9-8612-4439-b021-bda5f2bddb24</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </system>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <os>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </os>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <features>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </features>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk' index='2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/disk.config' index='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:c1:bc:72'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target dev='tap69143768-44'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       </target>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24/console.log' append='off'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </console>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <video>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </video>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c206,c989</label>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c206,c989</imagelabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:30:55 compute-0 nova_compute[186479]: </domain>
Feb 17 17:30:55 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.086 186483 WARNING nova.virt.libvirt.driver [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Detaching interface fa:16:3e:55:b0:3d failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap3c3d59e1-70' not found.
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.087 186483 DEBUG nova.virt.libvirt.vif [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.088 186483 DEBUG nova.network.os_vif_util [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converting VIF {"id": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "address": "fa:16:3e:55:b0:3d", "network": {"id": "da59558a-2562-48b1-800e-9e22eeba4e27", "bridge": "br-int", "label": "tempest-network-smoke--1981846795", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c3d59e1-70", "ovs_interfaceid": "3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.089 186483 DEBUG nova.network.os_vif_util [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.090 186483 DEBUG os_vif [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.091 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.092 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c3d59e1-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.092 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.096 186483 INFO os_vif [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b0:3d,bridge_name='br-int',has_traffic_filtering=True,id=3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6,network=Network(da59558a-2562-48b1-800e-9e22eeba4e27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3c3d59e1-70')
Feb 17 17:30:55 compute-0 nova_compute[186479]: 2026-02-17 17:30:55.097 186483 DEBUG nova.virt.libvirt.guest [req-370e83ef-cc3c-40aa-ae3e-2bbcb9936a88 req-abc8a8f0-3b5c-4455-b13a-d1ea933746b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2040077989</nova:name>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:30:55</nova:creationTime>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     <nova:port uuid="69143768-441a-4b58-8b86-b127b5cb10ba">
Feb 17 17:30:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 17 17:30:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:30:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:30:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:30:55 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:30:55 compute-0 podman[216508]: 2026-02-17 17:30:55.712683678 +0000 UTC m=+0.048407003 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:30:56 compute-0 ovn_controller[96568]: 2026-02-17T17:30:56Z|00061|binding|INFO|Releasing lport f26c418c-dcca-4701-be98-da0e994433df from this chassis (sb_readonly=0)
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.034 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.306 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.478 186483 INFO nova.network.neutron [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Port 3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.479 186483 DEBUG nova.network.neutron [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [{"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.498 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:56 compute-0 sshd-session[216534]: Invalid user test from 209.38.233.161 port 36876
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.528 186483 DEBUG oslo_concurrency.lockutils [None req-a4e71129-2a4c-4760-8c39-5b7d50c61389 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3ad24cf9-8612-4439-b021-bda5f2bddb24-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:56 compute-0 sshd-session[216534]: Connection closed by invalid user test 209.38.233.161 port 36876 [preauth]
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.663 186483 DEBUG nova.compute.manager [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.663 186483 DEBUG oslo_concurrency.lockutils [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.664 186483 DEBUG oslo_concurrency.lockutils [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.664 186483 DEBUG oslo_concurrency.lockutils [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.664 186483 DEBUG nova.compute.manager [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.664 186483 WARNING nova.compute.manager [req-da98036c-44d6-4896-b4fa-c46cb8b0b0c7 req-39bc8ddd-8dad-458c-a36c-5508041d7408 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-plugged-3c3d59e1-704b-430a-9b9a-1b4c9a31d9b6 for instance with vm_state active and task_state None.
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.843 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.844 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.844 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.844 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.845 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.846 186483 INFO nova.compute.manager [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Terminating instance
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.847 186483 DEBUG nova.compute.manager [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:30:56 compute-0 kernel: tap69143768-44 (unregistering): left promiscuous mode
Feb 17 17:30:56 compute-0 NetworkManager[56323]: <info>  [1771349456.8713] device (tap69143768-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.872 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:56 compute-0 ovn_controller[96568]: 2026-02-17T17:30:56Z|00062|binding|INFO|Releasing lport 69143768-441a-4b58-8b86-b127b5cb10ba from this chassis (sb_readonly=0)
Feb 17 17:30:56 compute-0 ovn_controller[96568]: 2026-02-17T17:30:56Z|00063|binding|INFO|Setting lport 69143768-441a-4b58-8b86-b127b5cb10ba down in Southbound
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.878 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:56 compute-0 ovn_controller[96568]: 2026-02-17T17:30:56Z|00064|binding|INFO|Removing iface tap69143768-44 ovn-installed in OVS
Feb 17 17:30:56 compute-0 nova_compute[186479]: 2026-02-17 17:30:56.884 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:56.888 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:bc:72 10.100.0.12'], port_security=['fa:16:3e:c1:bc:72 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3ad24cf9-8612-4439-b021-bda5f2bddb24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a74e4f80-b041-475a-a4c7-2220e3eb2e06', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24a37ac2-f3c1-40d9-9892-4196ddc52d6c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=69143768-441a-4b58-8b86-b127b5cb10ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:30:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:56.890 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 69143768-441a-4b58-8b86-b127b5cb10ba in datapath 4b74eb23-d2ba-4cd7-803b-057cc56db5a9 unbound from our chassis
Feb 17 17:30:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:56.891 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b74eb23-d2ba-4cd7-803b-057cc56db5a9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:30:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:56.892 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f282c00a-4f20-4afd-a00d-db646b53da5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:56.893 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9 namespace which is not needed anymore
Feb 17 17:30:56 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 17 17:30:56 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 12.977s CPU time.
Feb 17 17:30:56 compute-0 systemd-machined[155877]: Machine qemu-3-instance-00000003 terminated.
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [NOTICE]   (216179) : haproxy version is 2.8.14-c23fe91
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [NOTICE]   (216179) : path to executable is /usr/sbin/haproxy
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [WARNING]  (216179) : Exiting Master process...
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [WARNING]  (216179) : Exiting Master process...
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [ALERT]    (216179) : Current worker (216181) exited with code 143 (Terminated)
Feb 17 17:30:57 compute-0 neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9[216175]: [WARNING]  (216179) : All workers exited. Exiting... (0)
Feb 17 17:30:57 compute-0 systemd[1]: libpod-b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e.scope: Deactivated successfully.
Feb 17 17:30:57 compute-0 podman[216560]: 2026-02-17 17:30:57.025447613 +0000 UTC m=+0.042441369 container died b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 17 17:30:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e-userdata-shm.mount: Deactivated successfully.
Feb 17 17:30:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-ea28a5ff56d146f119a735a940acf054a14121a7c2a926cc49d5d76024385e5d-merged.mount: Deactivated successfully.
Feb 17 17:30:57 compute-0 podman[216560]: 2026-02-17 17:30:57.061733884 +0000 UTC m=+0.078727650 container cleanup b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:30:57 compute-0 systemd[1]: libpod-conmon-b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e.scope: Deactivated successfully.
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.096 186483 INFO nova.virt.libvirt.driver [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance destroyed successfully.
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.097 186483 DEBUG nova.objects.instance [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 3ad24cf9-8612-4439-b021-bda5f2bddb24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.108 186483 DEBUG nova.virt.libvirt.vif [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:30:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2040077989',display_name='tempest-TestNetworkBasicOps-server-2040077989',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2040077989',id=3,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKeVBbhOS69U2o/qdWVjQc/kA4LynViBOxubp8Mv3krMdHX8NlEfC9nO77wzBe+GMUKbYniP3l1YAkyRYnocZJo5PlPjUwprrqnBAmIy27Oc4hLoupP/GAxrvI7a1aFqGA==',key_name='tempest-TestNetworkBasicOps-971265848',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:30:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-j0cyv0oy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:30:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3ad24cf9-8612-4439-b021-bda5f2bddb24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.109 186483 DEBUG nova.network.os_vif_util [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "69143768-441a-4b58-8b86-b127b5cb10ba", "address": "fa:16:3e:c1:bc:72", "network": {"id": "4b74eb23-d2ba-4cd7-803b-057cc56db5a9", "bridge": "br-int", "label": "tempest-network-smoke--2145510902", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69143768-44", "ovs_interfaceid": "69143768-441a-4b58-8b86-b127b5cb10ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.110 186483 DEBUG nova.network.os_vif_util [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.110 186483 DEBUG os_vif [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.112 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.112 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69143768-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:57 compute-0 podman[216598]: 2026-02-17 17:30:57.139728496 +0000 UTC m=+0.057994602 container remove b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.158 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.160 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[33d30785-fc95-4538-94c4-2e3437f63a1e]: (4, ('Tue Feb 17 05:30:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9 (b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e)\nb0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e\nTue Feb 17 05:30:57 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9 (b0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e)\nb0ef569f2ba0f0e64b2af169b55f585645150c6dc9021aa373982dc657510f0e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.162 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.162 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[86008723-f34f-499d-9396-eb13c0c8d381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.163 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b74eb23-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:30:57 compute-0 kernel: tap4b74eb23-d0: left promiscuous mode
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.165 186483 INFO os_vif [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:bc:72,bridge_name='br-int',has_traffic_filtering=True,id=69143768-441a-4b58-8b86-b127b5cb10ba,network=Network(4b74eb23-d2ba-4cd7-803b-057cc56db5a9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69143768-44')
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.166 186483 INFO nova.virt.libvirt.driver [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Deleting instance files /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24_del
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.168 186483 INFO nova.virt.libvirt.driver [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Deletion of /var/lib/nova/instances/3ad24cf9-8612-4439-b021-bda5f2bddb24_del complete
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.171 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ae8d4c91-4d34-4bac-9e16-3822b67ee9f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.172 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.186 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b7b3ce21-4422-446a-be3e-9aa1772315b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.188 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[30f0f3a7-2f1a-41ec-a556-7000fe77e681]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.206 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2169a3-b998-4021-9084-7495dc345fad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 317787, 'reachable_time': 17519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216624, 'error': None, 'target': 'ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b74eb23\x2dd2ba\x2d4cd7\x2d803b\x2d057cc56db5a9.mount: Deactivated successfully.
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.209 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b74eb23-d2ba-4cd7-803b-057cc56db5a9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:30:57 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:30:57.209 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[fd5a6c08-b3a5-4e8d-a731-a83067de03fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.228 186483 INFO nova.compute.manager [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Took 0.38 seconds to destroy the instance on the hypervisor.
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.228 186483 DEBUG oslo.service.loopingcall [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.229 186483 DEBUG nova.compute.manager [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.229 186483 DEBUG nova.network.neutron [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.819 186483 DEBUG nova.network.neutron [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.839 186483 INFO nova.compute.manager [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Took 0.61 seconds to deallocate network for instance.
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.901 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.902 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.961 186483 DEBUG nova.compute.provider_tree [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:30:57 compute-0 nova_compute[186479]: 2026-02-17 17:30:57.979 186483 DEBUG nova.scheduler.client.report [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.002 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.025 186483 INFO nova.scheduler.client.report [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 3ad24cf9-8612-4439-b021-bda5f2bddb24
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.096 186483 DEBUG oslo_concurrency.lockutils [None req-66541030-f69f-4d38-8bd3-582f3e29c431 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.471 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.497 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.498 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.498 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.498 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.657 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.658 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5736MB free_disk=73.21100234985352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.659 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.659 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.715 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.716 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.739 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.752 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.757 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.757 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing instance network info cache due to event network-changed-69143768-441a-4b58-8b86-b127b5cb10ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.757 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.758 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.758 186483 DEBUG nova.network.neutron [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Refreshing network info cache for port 69143768-441a-4b58-8b86-b127b5cb10ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.770 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:30:58 compute-0 nova_compute[186479]: 2026-02-17 17:30:58.771 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.288 186483 DEBUG nova.network.neutron [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.643 186483 DEBUG nova.network.neutron [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.644 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3ad24cf9-8612-4439-b021-bda5f2bddb24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.645 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-unplugged-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.645 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.645 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.646 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.646 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-unplugged-69143768-441a-4b58-8b86-b127b5cb10ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.646 186483 WARNING nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-unplugged-69143768-441a-4b58-8b86-b127b5cb10ba for instance with vm_state deleted and task_state None.
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.646 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.647 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.647 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.647 186483 DEBUG oslo_concurrency.lockutils [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3ad24cf9-8612-4439-b021-bda5f2bddb24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.647 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] No waiting events found dispatching network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.648 186483 WARNING nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received unexpected event network-vif-plugged-69143768-441a-4b58-8b86-b127b5cb10ba for instance with vm_state deleted and task_state None.
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.648 186483 DEBUG nova.compute.manager [req-fe4b8235-14a3-4eb3-840d-025c8cf5acba req-47033f98-b02c-46d1-baff-ccfc83392178 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Received event network-vif-deleted-69143768-441a-4b58-8b86-b127b5cb10ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:30:59 compute-0 nova_compute[186479]: 2026-02-17 17:30:59.766 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:00 compute-0 nova_compute[186479]: 2026-02-17 17:31:00.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:01 compute-0 nova_compute[186479]: 2026-02-17 17:31:01.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:01 compute-0 nova_compute[186479]: 2026-02-17 17:31:01.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:31:01 compute-0 nova_compute[186479]: 2026-02-17 17:31:01.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:31:01 compute-0 nova_compute[186479]: 2026-02-17 17:31:01.324 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:31:02 compute-0 nova_compute[186479]: 2026-02-17 17:31:02.159 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:02 compute-0 nova_compute[186479]: 2026-02-17 17:31:02.925 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:02 compute-0 nova_compute[186479]: 2026-02-17 17:31:02.950 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:03 compute-0 nova_compute[186479]: 2026-02-17 17:31:03.473 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:04 compute-0 podman[216627]: 2026-02-17 17:31:04.748171937 +0000 UTC m=+0.089623982 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:31:04 compute-0 rsyslogd[1015]: imjournal: 5703 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 17 17:31:07 compute-0 nova_compute[186479]: 2026-02-17 17:31:07.163 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:07 compute-0 podman[216654]: 2026-02-17 17:31:07.736913976 +0000 UTC m=+0.079510870 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:31:08 compute-0 nova_compute[186479]: 2026-02-17 17:31:08.474 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:10.949 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:10.949 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:10.949 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:12 compute-0 nova_compute[186479]: 2026-02-17 17:31:12.094 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349457.0934186, 3ad24cf9-8612-4439-b021-bda5f2bddb24 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:12 compute-0 nova_compute[186479]: 2026-02-17 17:31:12.095 186483 INFO nova.compute.manager [-] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] VM Stopped (Lifecycle Event)
Feb 17 17:31:12 compute-0 nova_compute[186479]: 2026-02-17 17:31:12.125 186483 DEBUG nova.compute.manager [None req-55c46022-3fad-4178-b111-3d5cbc11544b - - - - - -] [instance: 3ad24cf9-8612-4439-b021-bda5f2bddb24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:12 compute-0 nova_compute[186479]: 2026-02-17 17:31:12.166 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:13 compute-0 nova_compute[186479]: 2026-02-17 17:31:13.477 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:13 compute-0 podman[216680]: 2026-02-17 17:31:13.713820179 +0000 UTC m=+0.058248188 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7)
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.078 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.079 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.096 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.177 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.178 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.188 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.188 186483 INFO nova.compute.claims [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.300 186483 DEBUG nova.compute.provider_tree [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.313 186483 DEBUG nova.scheduler.client.report [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.331 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.332 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.371 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.371 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.387 186483 INFO nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.407 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.516 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.518 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.518 186483 INFO nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Creating image(s)
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.519 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.520 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.521 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.534 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.591 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.592 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.593 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.604 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.657 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.658 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.692 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.693 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.693 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.742 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.743 186483 DEBUG nova.virt.disk.api [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.744 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.796 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.797 186483 DEBUG nova.virt.disk.api [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.797 186483 DEBUG nova.objects.instance [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid af4e641d-d312-4f39-878b-fd7ddc3984df obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.813 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.813 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Ensure instance console log exists: /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.814 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.814 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:15 compute-0 nova_compute[186479]: 2026-02-17 17:31:15.814 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:16 compute-0 nova_compute[186479]: 2026-02-17 17:31:16.335 186483 DEBUG nova.policy [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:31:17 compute-0 nova_compute[186479]: 2026-02-17 17:31:17.169 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:18 compute-0 nova_compute[186479]: 2026-02-17 17:31:18.383 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Successfully created port: c8432488-9dc7-47f2-88da-abda34b62ca5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:31:18 compute-0 nova_compute[186479]: 2026-02-17 17:31:18.479 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.059 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Successfully updated port: c8432488-9dc7-47f2-88da-abda34b62ca5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.078 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.078 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.078 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.136 186483 DEBUG nova.compute.manager [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.136 186483 DEBUG nova.compute.manager [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing instance network info cache due to event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.137 186483 DEBUG oslo_concurrency.lockutils [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:19 compute-0 nova_compute[186479]: 2026-02-17 17:31:19.201 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.364 186483 DEBUG nova.network.neutron [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.380 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.380 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Instance network_info: |[{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.380 186483 DEBUG oslo_concurrency.lockutils [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.381 186483 DEBUG nova.network.neutron [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.383 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Start _get_guest_xml network_info=[{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.387 186483 WARNING nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.396 186483 DEBUG nova.virt.libvirt.host [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.397 186483 DEBUG nova.virt.libvirt.host [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.403 186483 DEBUG nova.virt.libvirt.host [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.404 186483 DEBUG nova.virt.libvirt.host [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.404 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.405 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.405 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.405 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.406 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.406 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.406 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.406 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.407 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.407 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.407 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.407 186483 DEBUG nova.virt.hardware [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.411 186483 DEBUG nova.virt.libvirt.vif [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:31:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-972534764',display_name='tempest-TestNetworkBasicOps-server-972534764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-972534764',id=4,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKif6TcgbUhwZNSHA/7Zqj2ckTW5gy38G2PEojWfF3Gkej5Cj+1R9oPgDqLWMFj0xHsnWUhwl7YVr/EP5RRCVJHFnJIkLEAALNr4nUjt1M4lQ7PTSk0Axc9skXgk3GMq2g==',key_name='tempest-TestNetworkBasicOps-1576131339',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-n1mkh1rf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:31:15Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=af4e641d-d312-4f39-878b-fd7ddc3984df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.411 186483 DEBUG nova.network.os_vif_util [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.412 186483 DEBUG nova.network.os_vif_util [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.414 186483 DEBUG nova.objects.instance [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid af4e641d-d312-4f39-878b-fd7ddc3984df obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.433 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <uuid>af4e641d-d312-4f39-878b-fd7ddc3984df</uuid>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <name>instance-00000004</name>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-972534764</nova:name>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:31:20</nova:creationTime>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         <nova:port uuid="c8432488-9dc7-47f2-88da-abda34b62ca5">
Feb 17 17:31:20 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <system>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="serial">af4e641d-d312-4f39-878b-fd7ddc3984df</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="uuid">af4e641d-d312-4f39-878b-fd7ddc3984df</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </system>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <os>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </os>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <features>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </features>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.config"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:f2:86:09"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <target dev="tapc8432488-9d"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/console.log" append="off"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <video>
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </video>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:31:20 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:31:20 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:31:20 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:31:20 compute-0 nova_compute[186479]: </domain>
Feb 17 17:31:20 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.434 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Preparing to wait for external event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.435 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.435 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.435 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.436 186483 DEBUG nova.virt.libvirt.vif [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:31:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-972534764',display_name='tempest-TestNetworkBasicOps-server-972534764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-972534764',id=4,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKif6TcgbUhwZNSHA/7Zqj2ckTW5gy38G2PEojWfF3Gkej5Cj+1R9oPgDqLWMFj0xHsnWUhwl7YVr/EP5RRCVJHFnJIkLEAALNr4nUjt1M4lQ7PTSk0Axc9skXgk3GMq2g==',key_name='tempest-TestNetworkBasicOps-1576131339',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-n1mkh1rf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:31:15Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=af4e641d-d312-4f39-878b-fd7ddc3984df,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.437 186483 DEBUG nova.network.os_vif_util [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.437 186483 DEBUG nova.network.os_vif_util [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.437 186483 DEBUG os_vif [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.438 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.438 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.439 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.441 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.441 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8432488-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.442 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8432488-9d, col_values=(('external_ids', {'iface-id': 'c8432488-9dc7-47f2-88da-abda34b62ca5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:86:09', 'vm-uuid': 'af4e641d-d312-4f39-878b-fd7ddc3984df'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.443 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:20 compute-0 NetworkManager[56323]: <info>  [1771349480.4448] manager: (tapc8432488-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.445 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.449 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.450 186483 INFO os_vif [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d')
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.503 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.504 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.504 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:f2:86:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:31:20 compute-0 nova_compute[186479]: 2026-02-17 17:31:20.504 186483 INFO nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Using config drive
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.327 186483 INFO nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Creating config drive at /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.config
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.335 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcgqcqthk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.460 186483 DEBUG oslo_concurrency.processutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcgqcqthk" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:21 compute-0 kernel: tapc8432488-9d: entered promiscuous mode
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.5055] manager: (tapc8432488-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 17 17:31:21 compute-0 ovn_controller[96568]: 2026-02-17T17:31:21Z|00065|binding|INFO|Claiming lport c8432488-9dc7-47f2-88da-abda34b62ca5 for this chassis.
Feb 17 17:31:21 compute-0 ovn_controller[96568]: 2026-02-17T17:31:21Z|00066|binding|INFO|c8432488-9dc7-47f2-88da-abda34b62ca5: Claiming fa:16:3e:f2:86:09 10.100.0.4
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.508 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.511 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.513 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.516 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.524 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:86:09 10.100.0.4'], port_security=['fa:16:3e:f2:86:09 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af4e641d-d312-4f39-878b-fd7ddc3984df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '823b55fc-0d55-46d3-955d-e46832564672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=c8432488-9dc7-47f2-88da-abda34b62ca5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.525 105898 INFO neutron.agent.ovn.metadata.agent [-] Port c8432488-9dc7-47f2-88da-abda34b62ca5 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 bound to our chassis
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.526 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ea18f88-05f8-477a-96b4-268feae14237
Feb 17 17:31:21 compute-0 ovn_controller[96568]: 2026-02-17T17:31:21Z|00067|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 ovn-installed in OVS
Feb 17 17:31:21 compute-0 ovn_controller[96568]: 2026-02-17T17:31:21Z|00068|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 up in Southbound
Feb 17 17:31:21 compute-0 systemd-machined[155877]: New machine qemu-4-instance-00000004.
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.538 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.539 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9c6c77-ae67-4b6b-bbfe-e2d5ef9a8c8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.540 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ea18f88-01 in ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.542 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ea18f88-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.542 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[61a5b02d-1534-42b0-9612-b5a7ec28bf84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.543 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c3465271-71f7-4739-83ca-df9896e9df2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.552 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[bc33bfb0-291f-44a2-9a28-eaea091b2997]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 systemd-udevd[216741]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.5686] device (tapc8432488-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.5695] device (tapc8432488-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.578 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[14ae9e14-8a1c-461c-823e-01f1a46b287c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.601 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[bd21a6d6-fdbd-4048-92e8-c69fbbe78886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.605 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[9cab5610-5bbd-4930-b5d9-fe04d74b24bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.6063] manager: (tap6ea18f88-00): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.632 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3a5f32-5194-485f-9b10-cdb6c8fa0437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.635 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[803ecf82-2992-4444-9c61-d91f825d6824]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.6537] device (tap6ea18f88-00): carrier: link connected
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.657 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[384533d0-e61b-46ea-9e9f-489d97837235]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.673 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc0707f-96e7-45cb-92a4-728c0ecf26b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea18f88-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:4d:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323696, 'reachable_time': 29795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216772, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.687 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[27c1a537-602d-4d3e-b8d1-cdabd64560fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:4de5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323696, 'tstamp': 323696}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216773, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.709 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[485cc751-7704-4186-af9a-cfde1ea00b19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea18f88-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:4d:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323696, 'reachable_time': 29795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216774, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.743 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[61a21368-d6c1-4980-8f40-bf6798926d69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.745 186483 DEBUG nova.compute.manager [req-abaa7971-3ac9-4845-84ff-acb204e85fc8 req-b7fce543-5d54-4d8d-b055-79881f5fb920 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.746 186483 DEBUG oslo_concurrency.lockutils [req-abaa7971-3ac9-4845-84ff-acb204e85fc8 req-b7fce543-5d54-4d8d-b055-79881f5fb920 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.746 186483 DEBUG oslo_concurrency.lockutils [req-abaa7971-3ac9-4845-84ff-acb204e85fc8 req-b7fce543-5d54-4d8d-b055-79881f5fb920 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.747 186483 DEBUG oslo_concurrency.lockutils [req-abaa7971-3ac9-4845-84ff-acb204e85fc8 req-b7fce543-5d54-4d8d-b055-79881f5fb920 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.747 186483 DEBUG nova.compute.manager [req-abaa7971-3ac9-4845-84ff-acb204e85fc8 req-b7fce543-5d54-4d8d-b055-79881f5fb920 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Processing event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.791 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[bf06a7cb-fc65-461c-a1bc-39c08c53b0cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.792 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea18f88-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.793 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.793 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ea18f88-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:21 compute-0 NetworkManager[56323]: <info>  [1771349481.7961] manager: (tap6ea18f88-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.796 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 kernel: tap6ea18f88-00: entered promiscuous mode
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.798 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ea18f88-00, col_values=(('external_ids', {'iface-id': 'e24d2411-f76f-4a15-8650-ce90df03012a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:21 compute-0 ovn_controller[96568]: 2026-02-17T17:31:21Z|00069|binding|INFO|Releasing lport e24d2411-f76f-4a15-8650-ce90df03012a from this chassis (sb_readonly=0)
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.800 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ea18f88-05f8-477a-96b4-268feae14237.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ea18f88-05f8-477a-96b4-268feae14237.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.801 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2436519c-a2d8-4c50-9ae5-71e1137b1353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.801 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-6ea18f88-05f8-477a-96b4-268feae14237
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/6ea18f88-05f8-477a-96b4-268feae14237.pid.haproxy
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 6ea18f88-05f8-477a-96b4-268feae14237
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:31:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:21.802 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'env', 'PROCESS_TAG=haproxy-6ea18f88-05f8-477a-96b4-268feae14237', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ea18f88-05f8-477a-96b4-268feae14237.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.802 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.821 186483 DEBUG nova.network.neutron [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updated VIF entry in instance network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.821 186483 DEBUG nova.network.neutron [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:21 compute-0 nova_compute[186479]: 2026-02-17 17:31:21.838 186483 DEBUG oslo_concurrency.lockutils [req-797d3b7a-0334-4e04-98be-3b8192abb7c5 req-a3218976-bee6-4288-8be3-d848bd703e20 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.005 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349482.0047915, af4e641d-d312-4f39-878b-fd7ddc3984df => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.005 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] VM Started (Lifecycle Event)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.008 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.011 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.015 186483 INFO nova.virt.libvirt.driver [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Instance spawned successfully.
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.016 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.034 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.042 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.045 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.046 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.046 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.046 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.047 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.047 186483 DEBUG nova.virt.libvirt.driver [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.077 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.078 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349482.0057535, af4e641d-d312-4f39-878b-fd7ddc3984df => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.078 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] VM Paused (Lifecycle Event)
Feb 17 17:31:22 compute-0 podman[216813]: 2026-02-17 17:31:22.11023035 +0000 UTC m=+0.049582051 container create 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.114 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.120 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349482.0112095, af4e641d-d312-4f39-878b-fd7ddc3984df => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.121 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] VM Resumed (Lifecycle Event)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.126 186483 INFO nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Took 6.61 seconds to spawn the instance on the hypervisor.
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.127 186483 DEBUG nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.148 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:22 compute-0 systemd[1]: Started libpod-conmon-9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285.scope.
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.151 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:31:22 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:31:22 compute-0 podman[216813]: 2026-02-17 17:31:22.080246001 +0000 UTC m=+0.019597722 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:31:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b44392ec6dad5b38bce18fbf931aa14fa64d0ecfd8c7a39abd0f23573b1bcd7b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.183 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:31:22 compute-0 podman[216813]: 2026-02-17 17:31:22.189272707 +0000 UTC m=+0.128624428 container init 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:31:22 compute-0 podman[216813]: 2026-02-17 17:31:22.198185881 +0000 UTC m=+0.137537582 container start 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.206 186483 INFO nova.compute.manager [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Took 7.07 seconds to build instance.
Feb 17 17:31:22 compute-0 podman[216826]: 2026-02-17 17:31:22.211269265 +0000 UTC m=+0.062113292 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:31:22 compute-0 podman[216829]: 2026-02-17 17:31:22.220753882 +0000 UTC m=+0.067257445 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 17 17:31:22 compute-0 nova_compute[186479]: 2026-02-17 17:31:22.221 186483 DEBUG oslo_concurrency.lockutils [None req-97fce25c-fc9e-4fbd-8fe1-7345a10b2ba7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:22 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [NOTICE]   (216867) : New worker (216870) forked
Feb 17 17:31:22 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [NOTICE]   (216867) : Loading success.
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.481 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.894 186483 DEBUG nova.compute.manager [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.895 186483 DEBUG oslo_concurrency.lockutils [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.896 186483 DEBUG oslo_concurrency.lockutils [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.896 186483 DEBUG oslo_concurrency.lockutils [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.897 186483 DEBUG nova.compute.manager [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] No waiting events found dispatching network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:31:23 compute-0 nova_compute[186479]: 2026-02-17 17:31:23.897 186483 WARNING nova.compute.manager [req-eb4b7a8e-7bf4-435a-8af0-72ca76eda976 req-48701589-5498-4a93-a0d5-ec9064be786b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received unexpected event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 for instance with vm_state active and task_state None.
Feb 17 17:31:25 compute-0 nova_compute[186479]: 2026-02-17 17:31:25.444 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:25 compute-0 nova_compute[186479]: 2026-02-17 17:31:25.608 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:25 compute-0 NetworkManager[56323]: <info>  [1771349485.6097] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 17 17:31:25 compute-0 NetworkManager[56323]: <info>  [1771349485.6115] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 17 17:31:25 compute-0 ovn_controller[96568]: 2026-02-17T17:31:25Z|00070|binding|INFO|Releasing lport e24d2411-f76f-4a15-8650-ce90df03012a from this chassis (sb_readonly=0)
Feb 17 17:31:25 compute-0 ovn_controller[96568]: 2026-02-17T17:31:25Z|00071|binding|INFO|Releasing lport e24d2411-f76f-4a15-8650-ce90df03012a from this chassis (sb_readonly=0)
Feb 17 17:31:25 compute-0 nova_compute[186479]: 2026-02-17 17:31:25.615 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:26 compute-0 nova_compute[186479]: 2026-02-17 17:31:26.007 186483 DEBUG nova.compute.manager [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:26 compute-0 nova_compute[186479]: 2026-02-17 17:31:26.008 186483 DEBUG nova.compute.manager [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing instance network info cache due to event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:31:26 compute-0 nova_compute[186479]: 2026-02-17 17:31:26.008 186483 DEBUG oslo_concurrency.lockutils [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:26 compute-0 nova_compute[186479]: 2026-02-17 17:31:26.008 186483 DEBUG oslo_concurrency.lockutils [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:26 compute-0 nova_compute[186479]: 2026-02-17 17:31:26.009 186483 DEBUG nova.network.neutron [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:31:26 compute-0 podman[216881]: 2026-02-17 17:31:26.707045501 +0000 UTC m=+0.053484324 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:31:27 compute-0 nova_compute[186479]: 2026-02-17 17:31:27.439 186483 DEBUG nova.network.neutron [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updated VIF entry in instance network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:31:27 compute-0 nova_compute[186479]: 2026-02-17 17:31:27.441 186483 DEBUG nova.network.neutron [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:27 compute-0 nova_compute[186479]: 2026-02-17 17:31:27.458 186483 DEBUG oslo_concurrency.lockutils [req-a832a781-960c-4a90-a89c-6d5461ae5b90 req-a5e86adb-439e-479a-9361-f7c1f6acaf88 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:28 compute-0 nova_compute[186479]: 2026-02-17 17:31:28.484 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:30 compute-0 nova_compute[186479]: 2026-02-17 17:31:30.447 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:33 compute-0 nova_compute[186479]: 2026-02-17 17:31:33.486 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:33 compute-0 ovn_controller[96568]: 2026-02-17T17:31:33Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:86:09 10.100.0.4
Feb 17 17:31:33 compute-0 ovn_controller[96568]: 2026-02-17T17:31:33Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:86:09 10.100.0.4
Feb 17 17:31:35 compute-0 nova_compute[186479]: 2026-02-17 17:31:35.451 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:35 compute-0 podman[216925]: 2026-02-17 17:31:35.748029902 +0000 UTC m=+0.090614184 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 17 17:31:38 compute-0 nova_compute[186479]: 2026-02-17 17:31:38.489 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:38 compute-0 podman[216952]: 2026-02-17 17:31:38.744646121 +0000 UTC m=+0.080027604 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:31:40 compute-0 nova_compute[186479]: 2026-02-17 17:31:40.345 186483 INFO nova.compute.manager [None req-a1371724-8111-42e4-b7db-24992ab25bce 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Get console output
Feb 17 17:31:40 compute-0 nova_compute[186479]: 2026-02-17 17:31:40.352 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:31:40 compute-0 nova_compute[186479]: 2026-02-17 17:31:40.455 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.099 186483 DEBUG nova.compute.manager [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.100 186483 DEBUG nova.compute.manager [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing instance network info cache due to event network-changed-c8432488-9dc7-47f2-88da-abda34b62ca5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.100 186483 DEBUG oslo_concurrency.lockutils [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.101 186483 DEBUG oslo_concurrency.lockutils [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.101 186483 DEBUG nova.network.neutron [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Refreshing network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:31:43 compute-0 nova_compute[186479]: 2026-02-17 17:31:43.529 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:44 compute-0 nova_compute[186479]: 2026-02-17 17:31:44.219 186483 DEBUG nova.network.neutron [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updated VIF entry in instance network info cache for port c8432488-9dc7-47f2-88da-abda34b62ca5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:31:44 compute-0 nova_compute[186479]: 2026-02-17 17:31:44.219 186483 DEBUG nova.network.neutron [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:44 compute-0 nova_compute[186479]: 2026-02-17 17:31:44.237 186483 DEBUG oslo_concurrency.lockutils [req-b3300b7e-8de9-41eb-b9cb-fd204ab42a6d req-694bdf92-0bf6-480f-9c85-6beb47aa68f1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:44 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:44.408 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:31:44 compute-0 nova_compute[186479]: 2026-02-17 17:31:44.408 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:44 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:44.410 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:31:44 compute-0 podman[216977]: 2026-02-17 17:31:44.72540786 +0000 UTC m=+0.061230242 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, version=9.7, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 17 17:31:45 compute-0 nova_compute[186479]: 2026-02-17 17:31:45.458 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:48 compute-0 nova_compute[186479]: 2026-02-17 17:31:48.532 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.462 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.615 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.615 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.632 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.709 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.710 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.724 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.724 186483 INFO nova.compute.claims [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.839 186483 DEBUG nova.compute.provider_tree [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.857 186483 DEBUG nova.scheduler.client.report [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.879 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.880 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.932 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.933 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.959 186483 INFO nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:31:50 compute-0 nova_compute[186479]: 2026-02-17 17:31:50.978 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.074 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.076 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.076 186483 INFO nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Creating image(s)
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.077 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.078 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.079 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.102 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.165 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.166 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.167 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.183 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.228 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.230 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.258 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.259 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.260 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.304 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.305 186483 DEBUG nova.virt.disk.api [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.305 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.346 186483 DEBUG nova.policy [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.350 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.350 186483 DEBUG nova.virt.disk.api [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.351 186483 DEBUG nova.objects.instance [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 9d30324a-c03a-43f6-a245-3ac4f0923693 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.379 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.379 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Ensure instance console log exists: /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.380 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.381 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.381 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:51 compute-0 nova_compute[186479]: 2026-02-17 17:31:51.942 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Successfully created port: df8be582-0d74-458c-bb58-167d804479f1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:31:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:52.413 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:52 compute-0 podman[217013]: 2026-02-17 17:31:52.7381624 +0000 UTC m=+0.075659477 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 17 17:31:52 compute-0 podman[217014]: 2026-02-17 17:31:52.755164597 +0000 UTC m=+0.087416395 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.325 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Successfully updated port: df8be582-0d74-458c-bb58-167d804479f1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.343 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.343 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.344 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.403 186483 DEBUG nova.compute.manager [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-changed-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.404 186483 DEBUG nova.compute.manager [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Refreshing instance network info cache due to event network-changed-df8be582-0d74-458c-bb58-167d804479f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.404 186483 DEBUG oslo_concurrency.lockutils [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.491 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:31:53 compute-0 nova_compute[186479]: 2026-02-17 17:31:53.554 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.324 186483 DEBUG nova.network.neutron [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updating instance_info_cache with network_info: [{"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.344 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.345 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Instance network_info: |[{"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.346 186483 DEBUG oslo_concurrency.lockutils [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.346 186483 DEBUG nova.network.neutron [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Refreshing network info cache for port df8be582-0d74-458c-bb58-167d804479f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.351 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Start _get_guest_xml network_info=[{"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.357 186483 WARNING nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.363 186483 DEBUG nova.virt.libvirt.host [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.364 186483 DEBUG nova.virt.libvirt.host [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.373 186483 DEBUG nova.virt.libvirt.host [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.374 186483 DEBUG nova.virt.libvirt.host [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.374 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.375 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.375 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.376 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.376 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.377 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.377 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.377 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.378 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.378 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.379 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.379 186483 DEBUG nova.virt.hardware [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.385 186483 DEBUG nova.virt.libvirt.vif [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-570000635',display_name='tempest-TestNetworkBasicOps-server-570000635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-570000635',id=5,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGq2gHBT44SLjiFzk5lM6oqU2OGb0TC/GXH2UJFBIIhXQa3lvYj945Uq65Ff28vqy0DJ2ucPh3a1eBAM0ZIFsF8QTss6Z3Z0k/AY0USQCSIApBIdTevxLskGIJlWXYA/Qw==',key_name='tempest-TestNetworkBasicOps-1876004402',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-9k0p090f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:31:51Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9d30324a-c03a-43f6-a245-3ac4f0923693,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.385 186483 DEBUG nova.network.os_vif_util [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.386 186483 DEBUG nova.network.os_vif_util [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.388 186483 DEBUG nova.objects.instance [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 9d30324a-c03a-43f6-a245-3ac4f0923693 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.403 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <uuid>9d30324a-c03a-43f6-a245-3ac4f0923693</uuid>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <name>instance-00000005</name>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-570000635</nova:name>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:31:54</nova:creationTime>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         <nova:port uuid="df8be582-0d74-458c-bb58-167d804479f1">
Feb 17 17:31:54 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <system>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="serial">9d30324a-c03a-43f6-a245-3ac4f0923693</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="uuid">9d30324a-c03a-43f6-a245-3ac4f0923693</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </system>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <os>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </os>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <features>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </features>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.config"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:bf:64:da"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <target dev="tapdf8be582-0d"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/console.log" append="off"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <video>
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </video>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:31:54 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:31:54 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:31:54 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:31:54 compute-0 nova_compute[186479]: </domain>
Feb 17 17:31:54 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.404 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Preparing to wait for external event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.405 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.405 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.405 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.406 186483 DEBUG nova.virt.libvirt.vif [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-570000635',display_name='tempest-TestNetworkBasicOps-server-570000635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-570000635',id=5,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGq2gHBT44SLjiFzk5lM6oqU2OGb0TC/GXH2UJFBIIhXQa3lvYj945Uq65Ff28vqy0DJ2ucPh3a1eBAM0ZIFsF8QTss6Z3Z0k/AY0USQCSIApBIdTevxLskGIJlWXYA/Qw==',key_name='tempest-TestNetworkBasicOps-1876004402',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-9k0p090f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:31:51Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9d30324a-c03a-43f6-a245-3ac4f0923693,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.406 186483 DEBUG nova.network.os_vif_util [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.407 186483 DEBUG nova.network.os_vif_util [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.407 186483 DEBUG os_vif [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.408 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.408 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.408 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.412 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.412 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf8be582-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.412 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf8be582-0d, col_values=(('external_ids', {'iface-id': 'df8be582-0d74-458c-bb58-167d804479f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bf:64:da', 'vm-uuid': '9d30324a-c03a-43f6-a245-3ac4f0923693'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.414 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 NetworkManager[56323]: <info>  [1771349514.4152] manager: (tapdf8be582-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.416 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.421 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.422 186483 INFO os_vif [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d')
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.494 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.494 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.495 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:bf:64:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.495 186483 INFO nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Using config drive
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.799 186483 INFO nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Creating config drive at /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.config
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.805 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpega2z4jf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.933 186483 DEBUG oslo_concurrency.processutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpega2z4jf" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:31:54 compute-0 kernel: tapdf8be582-0d: entered promiscuous mode
Feb 17 17:31:54 compute-0 NetworkManager[56323]: <info>  [1771349514.9883] manager: (tapdf8be582-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.988 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:54 compute-0 ovn_controller[96568]: 2026-02-17T17:31:54Z|00072|binding|INFO|Claiming lport df8be582-0d74-458c-bb58-167d804479f1 for this chassis.
Feb 17 17:31:54 compute-0 ovn_controller[96568]: 2026-02-17T17:31:54Z|00073|binding|INFO|df8be582-0d74-458c-bb58-167d804479f1: Claiming fa:16:3e:bf:64:da 10.100.0.10
Feb 17 17:31:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:54.997 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:64:da 10.100.0.10'], port_security=['fa:16:3e:bf:64:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9d30324a-c03a-43f6-a245-3ac4f0923693', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '278ea554-92a8-4674-9f8b-7cea90c3d564', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=df8be582-0d74-458c-bb58-167d804479f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:31:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:54.998 105898 INFO neutron.agent.ovn.metadata.agent [-] Port df8be582-0d74-458c-bb58-167d804479f1 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 bound to our chassis
Feb 17 17:31:54 compute-0 ovn_controller[96568]: 2026-02-17T17:31:54Z|00074|binding|INFO|Setting lport df8be582-0d74-458c-bb58-167d804479f1 up in Southbound
Feb 17 17:31:54 compute-0 ovn_controller[96568]: 2026-02-17T17:31:54Z|00075|binding|INFO|Setting lport df8be582-0d74-458c-bb58-167d804479f1 ovn-installed in OVS
Feb 17 17:31:54 compute-0 nova_compute[186479]: 2026-02-17 17:31:54.999 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:54.999 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ea18f88-05f8-477a-96b4-268feae14237
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.004 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.015 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7a95fb-a844-492d-8431-8b52dbfffac0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 systemd-machined[155877]: New machine qemu-5-instance-00000005.
Feb 17 17:31:55 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.043 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[e988c53b-bcef-4128-b078-9a8f6a864880]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 systemd-udevd[217074]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.048 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[64314e04-03c7-4935-afdc-b0917f0fed9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 NetworkManager[56323]: <info>  [1771349515.0616] device (tapdf8be582-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:31:55 compute-0 NetworkManager[56323]: <info>  [1771349515.0625] device (tapdf8be582-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.074 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[0d56238a-e213-4dde-a085-4370c87d4b75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.091 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[69d35553-32d6-4ca5-b1ec-0234cbcb4b25]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea18f88-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:4d:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323696, 'reachable_time': 29795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217084, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.105 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a28292-5b0d-472e-8ced-f429cc15d03e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6ea18f88-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323707, 'tstamp': 323707}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217086, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6ea18f88-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323709, 'tstamp': 323709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217086, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.107 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea18f88-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.109 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.110 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ea18f88-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.110 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.111 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ea18f88-00, col_values=(('external_ids', {'iface-id': 'e24d2411-f76f-4a15-8650-ce90df03012a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:31:55 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:31:55.111 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.476 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349515.4761777, 9d30324a-c03a-43f6-a245-3ac4f0923693 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.477 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] VM Started (Lifecycle Event)
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.508 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.512 186483 DEBUG nova.compute.manager [req-742f0211-411f-440c-9a79-abcd9d628def req-8c88a715-deaf-4793-ac03-d12e4e167955 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.513 186483 DEBUG oslo_concurrency.lockutils [req-742f0211-411f-440c-9a79-abcd9d628def req-8c88a715-deaf-4793-ac03-d12e4e167955 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.513 186483 DEBUG oslo_concurrency.lockutils [req-742f0211-411f-440c-9a79-abcd9d628def req-8c88a715-deaf-4793-ac03-d12e4e167955 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.514 186483 DEBUG oslo_concurrency.lockutils [req-742f0211-411f-440c-9a79-abcd9d628def req-8c88a715-deaf-4793-ac03-d12e4e167955 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.514 186483 DEBUG nova.compute.manager [req-742f0211-411f-440c-9a79-abcd9d628def req-8c88a715-deaf-4793-ac03-d12e4e167955 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Processing event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.515 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.522 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.522 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349515.4763243, 9d30324a-c03a-43f6-a245-3ac4f0923693 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.523 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] VM Paused (Lifecycle Event)
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.528 186483 INFO nova.virt.libvirt.driver [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Instance spawned successfully.
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.529 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.547 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.554 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349515.5213158, 9d30324a-c03a-43f6-a245-3ac4f0923693 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.555 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] VM Resumed (Lifecycle Event)
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.560 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.561 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.561 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.562 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.562 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.563 186483 DEBUG nova.virt.libvirt.driver [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.585 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.589 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.613 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.615 186483 DEBUG nova.network.neutron [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updated VIF entry in instance network info cache for port df8be582-0d74-458c-bb58-167d804479f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.616 186483 DEBUG nova.network.neutron [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updating instance_info_cache with network_info: [{"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.620 186483 INFO nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Took 4.55 seconds to spawn the instance on the hypervisor.
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.621 186483 DEBUG nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.631 186483 DEBUG oslo_concurrency.lockutils [req-b242d726-13a8-4224-8676-4e4639a916cf req-2d32fbfd-8f8b-49f5-9c27-410a271a13af 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.681 186483 INFO nova.compute.manager [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Took 5.00 seconds to build instance.
Feb 17 17:31:55 compute-0 nova_compute[186479]: 2026-02-17 17:31:55.694 186483 DEBUG oslo_concurrency.lockutils [None req-5d33cc08-9726-4707-9fc3-72e050bc9bbf 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.583 186483 DEBUG nova.compute.manager [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.584 186483 DEBUG oslo_concurrency.lockutils [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.584 186483 DEBUG oslo_concurrency.lockutils [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.584 186483 DEBUG oslo_concurrency.lockutils [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.584 186483 DEBUG nova.compute.manager [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] No waiting events found dispatching network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:31:57 compute-0 nova_compute[186479]: 2026-02-17 17:31:57.585 186483 WARNING nova.compute.manager [req-2cebe1d8-96e6-4e16-b235-c9e93c4af2bb req-55afe4e1-23e1-47a7-9ca0-19c30eb5aa00 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received unexpected event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 for instance with vm_state active and task_state None.
Feb 17 17:31:57 compute-0 podman[217094]: 2026-02-17 17:31:57.740384959 +0000 UTC m=+0.070638595 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:31:58 compute-0 nova_compute[186479]: 2026-02-17 17:31:58.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:58 compute-0 nova_compute[186479]: 2026-02-17 17:31:58.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:58 compute-0 nova_compute[186479]: 2026-02-17 17:31:58.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:31:58 compute-0 nova_compute[186479]: 2026-02-17 17:31:58.557 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.416 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.675 186483 DEBUG nova.compute.manager [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-changed-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.676 186483 DEBUG nova.compute.manager [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Refreshing instance network info cache due to event network-changed-df8be582-0d74-458c-bb58-167d804479f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.676 186483 DEBUG oslo_concurrency.lockutils [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.677 186483 DEBUG oslo_concurrency.lockutils [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:31:59 compute-0 nova_compute[186479]: 2026-02-17 17:31:59.677 186483 DEBUG nova.network.neutron [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Refreshing network info cache for port df8be582-0d74-458c-bb58-167d804479f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.328 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.329 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.329 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.401 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.451 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.452 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.498 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.504 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.550 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.551 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.595 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.730 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.731 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5468MB free_disk=73.17767715454102GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.732 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.732 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.809 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Instance af4e641d-d312-4f39-878b-fd7ddc3984df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.810 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Instance 9d30324a-c03a-43f6-a245-3ac4f0923693 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.810 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.810 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.858 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.873 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.893 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:32:00 compute-0 nova_compute[186479]: 2026-02-17 17:32:00.895 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:01 compute-0 nova_compute[186479]: 2026-02-17 17:32:01.482 186483 DEBUG nova.network.neutron [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updated VIF entry in instance network info cache for port df8be582-0d74-458c-bb58-167d804479f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:32:01 compute-0 nova_compute[186479]: 2026-02-17 17:32:01.484 186483 DEBUG nova.network.neutron [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updating instance_info_cache with network_info: [{"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:01 compute-0 nova_compute[186479]: 2026-02-17 17:32:01.504 186483 DEBUG oslo_concurrency.lockutils [req-67ec4917-eb58-4161-95d0-0c4118f3c3d6 req-d6de9ad9-c920-4b45-937d-02cb5f36c4db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-9d30324a-c03a-43f6-a245-3ac4f0923693" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:32:03 compute-0 nova_compute[186479]: 2026-02-17 17:32:03.562 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:03 compute-0 nova_compute[186479]: 2026-02-17 17:32:03.895 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:03 compute-0 nova_compute[186479]: 2026-02-17 17:32:03.896 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:32:03 compute-0 nova_compute[186479]: 2026-02-17 17:32:03.897 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:32:04 compute-0 nova_compute[186479]: 2026-02-17 17:32:04.060 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:32:04 compute-0 nova_compute[186479]: 2026-02-17 17:32:04.060 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquired lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:32:04 compute-0 nova_compute[186479]: 2026-02-17 17:32:04.061 186483 DEBUG nova.network.neutron [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 17 17:32:04 compute-0 nova_compute[186479]: 2026-02-17 17:32:04.061 186483 DEBUG nova.objects.instance [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lazy-loading 'info_cache' on Instance uuid af4e641d-d312-4f39-878b-fd7ddc3984df obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:32:04 compute-0 nova_compute[186479]: 2026-02-17 17:32:04.418 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:05 compute-0 nova_compute[186479]: 2026-02-17 17:32:05.146 186483 DEBUG nova.network.neutron [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [{"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:05 compute-0 nova_compute[186479]: 2026-02-17 17:32:05.161 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Releasing lock "refresh_cache-af4e641d-d312-4f39-878b-fd7ddc3984df" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:32:05 compute-0 nova_compute[186479]: 2026-02-17 17:32:05.162 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 17 17:32:05 compute-0 nova_compute[186479]: 2026-02-17 17:32:05.565 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:06 compute-0 podman[217144]: 2026-02-17 17:32:06.75155312 +0000 UTC m=+0.080946297 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 17 17:32:07 compute-0 ovn_controller[96568]: 2026-02-17T17:32:07Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bf:64:da 10.100.0.10
Feb 17 17:32:07 compute-0 ovn_controller[96568]: 2026-02-17T17:32:07Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bf:64:da 10.100.0.10
Feb 17 17:32:07 compute-0 sshd-session[217172]: Received disconnect from 91.224.92.54 port 42364:11:  [preauth]
Feb 17 17:32:07 compute-0 sshd-session[217172]: Disconnected from authenticating user root 91.224.92.54 port 42364 [preauth]
Feb 17 17:32:08 compute-0 nova_compute[186479]: 2026-02-17 17:32:08.564 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:09 compute-0 nova_compute[186479]: 2026-02-17 17:32:09.452 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:09 compute-0 sshd-session[217174]: Invalid user test from 209.38.233.161 port 60790
Feb 17 17:32:09 compute-0 podman[217176]: 2026-02-17 17:32:09.731630182 +0000 UTC m=+0.066135813 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:32:09 compute-0 sshd-session[217174]: Connection closed by invalid user test 209.38.233.161 port 60790 [preauth]
Feb 17 17:32:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:10.949 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:10.950 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:10.951 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:13 compute-0 nova_compute[186479]: 2026-02-17 17:32:13.566 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:13 compute-0 nova_compute[186479]: 2026-02-17 17:32:13.990 186483 INFO nova.compute.manager [None req-781b50b2-14a9-4ed3-87d0-edef694e7c2e 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Get console output
Feb 17 17:32:13 compute-0 nova_compute[186479]: 2026-02-17 17:32:13.995 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.311 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.312 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.312 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.313 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.313 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.315 186483 INFO nova.compute.manager [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Terminating instance
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.316 186483 DEBUG nova.compute.manager [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:32:14 compute-0 kernel: tapdf8be582-0d (unregistering): left promiscuous mode
Feb 17 17:32:14 compute-0 NetworkManager[56323]: <info>  [1771349534.3391] device (tapdf8be582-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.353 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 ovn_controller[96568]: 2026-02-17T17:32:14Z|00076|binding|INFO|Releasing lport df8be582-0d74-458c-bb58-167d804479f1 from this chassis (sb_readonly=0)
Feb 17 17:32:14 compute-0 ovn_controller[96568]: 2026-02-17T17:32:14Z|00077|binding|INFO|Setting lport df8be582-0d74-458c-bb58-167d804479f1 down in Southbound
Feb 17 17:32:14 compute-0 ovn_controller[96568]: 2026-02-17T17:32:14Z|00078|binding|INFO|Removing iface tapdf8be582-0d ovn-installed in OVS
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.357 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.362 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.364 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:64:da 10.100.0.10'], port_security=['fa:16:3e:bf:64:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9d30324a-c03a-43f6-a245-3ac4f0923693', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '278ea554-92a8-4674-9f8b-7cea90c3d564', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.191'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=df8be582-0d74-458c-bb58-167d804479f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.367 105898 INFO neutron.agent.ovn.metadata.agent [-] Port df8be582-0d74-458c-bb58-167d804479f1 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 unbound from our chassis
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.370 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ea18f88-05f8-477a-96b4-268feae14237
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.383 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f03a0885-341f-49d5-8ed8-5888db3f0f5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 17 17:32:14 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.374s CPU time.
Feb 17 17:32:14 compute-0 systemd-machined[155877]: Machine qemu-5-instance-00000005 terminated.
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.406 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[28d27218-3152-4b2d-a0d4-f019bf39d14e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.410 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[149bd36a-8404-46fe-8a60-3193024a9682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.437 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[05e6cb54-eff9-4f0c-9b5d-ec7156d024f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.454 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.457 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[20dd6a4c-63fb-4688-ae0f-552cd4fdd1c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ea18f88-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:4d:e5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323696, 'reachable_time': 29795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217211, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.475 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[bf18e360-2d94-4ad8-ab07-66f3c782c750]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6ea18f88-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323707, 'tstamp': 323707}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217212, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6ea18f88-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323709, 'tstamp': 323709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217212, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.478 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea18f88-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.480 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.484 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.484 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ea18f88-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.485 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.486 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ea18f88-00, col_values=(('external_ids', {'iface-id': 'e24d2411-f76f-4a15-8650-ce90df03012a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:14.486 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.572 186483 INFO nova.virt.libvirt.driver [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Instance destroyed successfully.
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.573 186483 DEBUG nova.objects.instance [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 9d30324a-c03a-43f6-a245-3ac4f0923693 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.588 186483 DEBUG nova.virt.libvirt.vif [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:31:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-570000635',display_name='tempest-TestNetworkBasicOps-server-570000635',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-570000635',id=5,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGq2gHBT44SLjiFzk5lM6oqU2OGb0TC/GXH2UJFBIIhXQa3lvYj945Uq65Ff28vqy0DJ2ucPh3a1eBAM0ZIFsF8QTss6Z3Z0k/AY0USQCSIApBIdTevxLskGIJlWXYA/Qw==',key_name='tempest-TestNetworkBasicOps-1876004402',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:31:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-9k0p090f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:31:55Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9d30324a-c03a-43f6-a245-3ac4f0923693,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.588 186483 DEBUG nova.network.os_vif_util [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "df8be582-0d74-458c-bb58-167d804479f1", "address": "fa:16:3e:bf:64:da", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf8be582-0d", "ovs_interfaceid": "df8be582-0d74-458c-bb58-167d804479f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.590 186483 DEBUG nova.network.os_vif_util [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.590 186483 DEBUG os_vif [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.594 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.595 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf8be582-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.597 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.600 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.603 186483 INFO os_vif [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bf:64:da,bridge_name='br-int',has_traffic_filtering=True,id=df8be582-0d74-458c-bb58-167d804479f1,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf8be582-0d')
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.604 186483 INFO nova.virt.libvirt.driver [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Deleting instance files /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693_del
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.605 186483 INFO nova.virt.libvirt.driver [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Deletion of /var/lib/nova/instances/9d30324a-c03a-43f6-a245-3ac4f0923693_del complete
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.668 186483 INFO nova.compute.manager [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.669 186483 DEBUG oslo.service.loopingcall [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.669 186483 DEBUG nova.compute.manager [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:32:14 compute-0 nova_compute[186479]: 2026-02-17 17:32:14.669 186483 DEBUG nova.network.neutron [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:32:15 compute-0 podman[217231]: 2026-02-17 17:32:15.775670527 +0000 UTC m=+0.105088629 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.464 186483 DEBUG nova.compute.manager [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-unplugged-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.464 186483 DEBUG oslo_concurrency.lockutils [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.465 186483 DEBUG oslo_concurrency.lockutils [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.465 186483 DEBUG oslo_concurrency.lockutils [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.465 186483 DEBUG nova.compute.manager [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] No waiting events found dispatching network-vif-unplugged-df8be582-0d74-458c-bb58-167d804479f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.465 186483 DEBUG nova.compute.manager [req-efd16131-0cfb-4db3-a06f-a810c00f8690 req-228dd4a4-52cc-486c-b8f3-dec53920119b 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-unplugged-df8be582-0d74-458c-bb58-167d804479f1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.912 186483 DEBUG nova.network.neutron [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.932 186483 INFO nova.compute.manager [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Took 3.26 seconds to deallocate network for instance.
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.989 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:17 compute-0 nova_compute[186479]: 2026-02-17 17:32:17.990 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.073 186483 DEBUG nova.compute.provider_tree [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.090 186483 DEBUG nova.scheduler.client.report [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.114 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.140 186483 INFO nova.scheduler.client.report [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 9d30324a-c03a-43f6-a245-3ac4f0923693
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.211 186483 DEBUG oslo_concurrency.lockutils [None req-981fb3a2-6d0b-4a6b-872d-d7c2c39d7626 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:18 compute-0 nova_compute[186479]: 2026-02-17 17:32:18.568 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.546 186483 DEBUG nova.compute.manager [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-deleted-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.547 186483 DEBUG nova.compute.manager [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.548 186483 DEBUG oslo_concurrency.lockutils [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.548 186483 DEBUG oslo_concurrency.lockutils [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.549 186483 DEBUG oslo_concurrency.lockutils [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9d30324a-c03a-43f6-a245-3ac4f0923693-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.549 186483 DEBUG nova.compute.manager [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] No waiting events found dispatching network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.549 186483 WARNING nova.compute.manager [req-b46b97df-bf45-4bbe-aa30-5fac0ec33160 req-58f2d823-c166-433f-8822-9cd114b32a50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Received unexpected event network-vif-plugged-df8be582-0d74-458c-bb58-167d804479f1 for instance with vm_state deleted and task_state None.
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.597 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.916 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.917 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.917 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.917 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.917 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.919 186483 INFO nova.compute.manager [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Terminating instance
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.920 186483 DEBUG nova.compute.manager [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:32:19 compute-0 kernel: tapc8432488-9d (unregistering): left promiscuous mode
Feb 17 17:32:19 compute-0 NetworkManager[56323]: <info>  [1771349539.9460] device (tapc8432488-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.952 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:19 compute-0 ovn_controller[96568]: 2026-02-17T17:32:19Z|00079|binding|INFO|Releasing lport c8432488-9dc7-47f2-88da-abda34b62ca5 from this chassis (sb_readonly=0)
Feb 17 17:32:19 compute-0 ovn_controller[96568]: 2026-02-17T17:32:19Z|00080|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 down in Southbound
Feb 17 17:32:19 compute-0 ovn_controller[96568]: 2026-02-17T17:32:19Z|00081|binding|INFO|Removing iface tapc8432488-9d ovn-installed in OVS
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.955 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:19 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:19.960 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:86:09 10.100.0.4'], port_security=['fa:16:3e:f2:86:09 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af4e641d-d312-4f39-878b-fd7ddc3984df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '823b55fc-0d55-46d3-955d-e46832564672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=c8432488-9dc7-47f2-88da-abda34b62ca5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:19 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:19.961 105898 INFO neutron.agent.ovn.metadata.agent [-] Port c8432488-9dc7-47f2-88da-abda34b62ca5 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 unbound from our chassis
Feb 17 17:32:19 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:19.962 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ea18f88-05f8-477a-96b4-268feae14237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:32:19 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:19.963 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4a00dc13-229b-4277-a929-cbed47fa9181]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:19 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:19.963 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237 namespace which is not needed anymore
Feb 17 17:32:19 compute-0 nova_compute[186479]: 2026-02-17 17:32:19.964 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 17 17:32:20 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 14.297s CPU time.
Feb 17 17:32:20 compute-0 systemd-machined[155877]: Machine qemu-4-instance-00000004 terminated.
Feb 17 17:32:20 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [NOTICE]   (216867) : haproxy version is 2.8.14-c23fe91
Feb 17 17:32:20 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [NOTICE]   (216867) : path to executable is /usr/sbin/haproxy
Feb 17 17:32:20 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [WARNING]  (216867) : Exiting Master process...
Feb 17 17:32:20 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [ALERT]    (216867) : Current worker (216870) exited with code 143 (Terminated)
Feb 17 17:32:20 compute-0 neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237[216835]: [WARNING]  (216867) : All workers exited. Exiting... (0)
Feb 17 17:32:20 compute-0 systemd[1]: libpod-9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285.scope: Deactivated successfully.
Feb 17 17:32:20 compute-0 podman[217279]: 2026-02-17 17:32:20.111984619 +0000 UTC m=+0.060433404 container died 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 17 17:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285-userdata-shm.mount: Deactivated successfully.
Feb 17 17:32:20 compute-0 kernel: tapc8432488-9d: entered promiscuous mode
Feb 17 17:32:20 compute-0 NetworkManager[56323]: <info>  [1771349540.1386] manager: (tapc8432488-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00082|binding|INFO|Claiming lport c8432488-9dc7-47f2-88da-abda34b62ca5 for this chassis.
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.139 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00083|binding|INFO|c8432488-9dc7-47f2-88da-abda34b62ca5: Claiming fa:16:3e:f2:86:09 10.100.0.4
Feb 17 17:32:20 compute-0 systemd-udevd[217257]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:32:20 compute-0 kernel: tapc8432488-9d (unregistering): left promiscuous mode
Feb 17 17:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-b44392ec6dad5b38bce18fbf931aa14fa64d0ecfd8c7a39abd0f23573b1bcd7b-merged.mount: Deactivated successfully.
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.151 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:86:09 10.100.0.4'], port_security=['fa:16:3e:f2:86:09 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af4e641d-d312-4f39-878b-fd7ddc3984df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '823b55fc-0d55-46d3-955d-e46832564672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=c8432488-9dc7-47f2-88da-abda34b62ca5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00084|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 ovn-installed in OVS
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00085|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 up in Southbound
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00086|binding|INFO|Releasing lport c8432488-9dc7-47f2-88da-abda34b62ca5 from this chassis (sb_readonly=1)
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.158 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00087|if_status|INFO|Not setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 down as sb is readonly
Feb 17 17:32:20 compute-0 podman[217279]: 2026-02-17 17:32:20.159575986 +0000 UTC m=+0.108024781 container cleanup 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00088|binding|INFO|Removing iface tapc8432488-9d ovn-installed in OVS
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00089|binding|INFO|Releasing lport c8432488-9dc7-47f2-88da-abda34b62ca5 from this chassis (sb_readonly=1)
Feb 17 17:32:20 compute-0 ovn_controller[96568]: 2026-02-17T17:32:20Z|00090|binding|INFO|Setting lport c8432488-9dc7-47f2-88da-abda34b62ca5 down in Southbound
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.166 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.168 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:86:09 10.100.0.4'], port_security=['fa:16:3e:f2:86:09 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af4e641d-d312-4f39-878b-fd7ddc3984df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ea18f88-05f8-477a-96b4-268feae14237', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '823b55fc-0d55-46d3-955d-e46832564672', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d03eeed7-622b-4037-b9a8-17a167c6170b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=c8432488-9dc7-47f2-88da-abda34b62ca5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:20 compute-0 systemd[1]: libpod-conmon-9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285.scope: Deactivated successfully.
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.188 186483 INFO nova.virt.libvirt.driver [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Instance destroyed successfully.
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.189 186483 DEBUG nova.objects.instance [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid af4e641d-d312-4f39-878b-fd7ddc3984df obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.201 186483 DEBUG nova.virt.libvirt.vif [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:31:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-972534764',display_name='tempest-TestNetworkBasicOps-server-972534764',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-972534764',id=4,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKif6TcgbUhwZNSHA/7Zqj2ckTW5gy38G2PEojWfF3Gkej5Cj+1R9oPgDqLWMFj0xHsnWUhwl7YVr/EP5RRCVJHFnJIkLEAALNr4nUjt1M4lQ7PTSk0Axc9skXgk3GMq2g==',key_name='tempest-TestNetworkBasicOps-1576131339',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:31:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-n1mkh1rf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:31:22Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=af4e641d-d312-4f39-878b-fd7ddc3984df,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.202 186483 DEBUG nova.network.os_vif_util [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "c8432488-9dc7-47f2-88da-abda34b62ca5", "address": "fa:16:3e:f2:86:09", "network": {"id": "6ea18f88-05f8-477a-96b4-268feae14237", "bridge": "br-int", "label": "tempest-network-smoke--228843742", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8432488-9d", "ovs_interfaceid": "c8432488-9dc7-47f2-88da-abda34b62ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.202 186483 DEBUG nova.network.os_vif_util [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.203 186483 DEBUG os_vif [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.205 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.205 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8432488-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.207 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.208 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.211 186483 INFO os_vif [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:86:09,bridge_name='br-int',has_traffic_filtering=True,id=c8432488-9dc7-47f2-88da-abda34b62ca5,network=Network(6ea18f88-05f8-477a-96b4-268feae14237),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8432488-9d')
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.211 186483 INFO nova.virt.libvirt.driver [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Deleting instance files /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df_del
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.212 186483 INFO nova.virt.libvirt.driver [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Deletion of /var/lib/nova/instances/af4e641d-d312-4f39-878b-fd7ddc3984df_del complete
Feb 17 17:32:20 compute-0 podman[217317]: 2026-02-17 17:32:20.21883504 +0000 UTC m=+0.036862335 container remove 9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.223 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[54b90c61-e85c-4c7f-86ea-14d8c785eaf2]: (4, ('Tue Feb 17 05:32:20 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237 (9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285)\n9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285\nTue Feb 17 05:32:20 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237 (9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285)\n9397233f1daf73350023369226a7720ba4dbd2fc568166cfbd9f81c414ffc285\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.225 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b53fc4f0-fabb-443a-9b97-278f18aba9ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.225 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ea18f88-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.227 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 kernel: tap6ea18f88-00: left promiscuous mode
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.232 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.234 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0cea544a-769b-4031-a62d-3037e2752044]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.245 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f81d8f14-a11f-40c2-8485-7f7bb83803d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.247 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1a8bae76-1b27-442e-a5f5-30a6e186d57c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.253 186483 INFO nova.compute.manager [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.254 186483 DEBUG oslo.service.loopingcall [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.254 186483 DEBUG nova.compute.manager [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.254 186483 DEBUG nova.network.neutron [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.264 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e28b4963-978d-4359-ad57-8ed66f15d432]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323690, 'reachable_time': 25354, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217336, 'error': None, 'target': 'ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.267 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ea18f88-05f8-477a-96b4-268feae14237 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.267 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[b03da960-9146-4379-b672-a5eb19aee3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d6ea18f88\x2d05f8\x2d477a\x2d96b4\x2d268feae14237.mount: Deactivated successfully.
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.268 105898 INFO neutron.agent.ovn.metadata.agent [-] Port c8432488-9dc7-47f2-88da-abda34b62ca5 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 unbound from our chassis
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.269 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ea18f88-05f8-477a-96b4-268feae14237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.270 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[869e7344-92e8-4b6b-a6a8-bce6ae3d2871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.271 105898 INFO neutron.agent.ovn.metadata.agent [-] Port c8432488-9dc7-47f2-88da-abda34b62ca5 in datapath 6ea18f88-05f8-477a-96b4-268feae14237 unbound from our chassis
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.272 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ea18f88-05f8-477a-96b4-268feae14237, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:32:20 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:20.272 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[639449c8-23f3-4f9b-acee-3bba4b26b249]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.472 186483 DEBUG nova.compute.manager [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-unplugged-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.473 186483 DEBUG oslo_concurrency.lockutils [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.474 186483 DEBUG oslo_concurrency.lockutils [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.474 186483 DEBUG oslo_concurrency.lockutils [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.477 186483 DEBUG nova.compute.manager [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] No waiting events found dispatching network-vif-unplugged-c8432488-9dc7-47f2-88da-abda34b62ca5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.478 186483 DEBUG nova.compute.manager [req-ff87ec24-e64c-41c0-9717-e96038bb5c74 req-bda58bf0-3e94-432b-a767-a097cb592de5 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-unplugged-c8432488-9dc7-47f2-88da-abda34b62ca5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.715 186483 DEBUG nova.network.neutron [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.730 186483 INFO nova.compute.manager [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Took 0.48 seconds to deallocate network for instance.
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.765 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.766 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.806 186483 DEBUG nova.compute.provider_tree [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.824 186483 DEBUG nova.scheduler.client.report [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.847 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.872 186483 INFO nova.scheduler.client.report [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance af4e641d-d312-4f39-878b-fd7ddc3984df
Feb 17 17:32:20 compute-0 nova_compute[186479]: 2026-02-17 17:32:20.934 186483 DEBUG oslo_concurrency.lockutils [None req-2aa371a3-3ac6-4e38-9f47-18dcc1d40e59 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:21 compute-0 nova_compute[186479]: 2026-02-17 17:32:21.630 186483 DEBUG nova.compute.manager [req-861b0383-e0ec-4acc-942a-436573658a74 req-d55cc49d-fcdb-4b11-bdc1-8265a9ba05a7 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-deleted-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.573 186483 DEBUG nova.compute.manager [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.574 186483 DEBUG oslo_concurrency.lockutils [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.574 186483 DEBUG oslo_concurrency.lockutils [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.575 186483 DEBUG oslo_concurrency.lockutils [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "af4e641d-d312-4f39-878b-fd7ddc3984df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.575 186483 DEBUG nova.compute.manager [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] No waiting events found dispatching network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:32:22 compute-0 nova_compute[186479]: 2026-02-17 17:32:22.576 186483 WARNING nova.compute.manager [req-b03d2e47-a7dc-4169-84a2-c89b3f39f4b3 req-27439038-eb27-4bb6-9b17-b883f2f114fc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Received unexpected event network-vif-plugged-c8432488-9dc7-47f2-88da-abda34b62ca5 for instance with vm_state deleted and task_state None.
Feb 17 17:32:23 compute-0 nova_compute[186479]: 2026-02-17 17:32:23.571 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:23 compute-0 podman[217338]: 2026-02-17 17:32:23.729093349 +0000 UTC m=+0.065677642 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Feb 17 17:32:23 compute-0 podman[217337]: 2026-02-17 17:32:23.729887449 +0000 UTC m=+0.063931180 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 17 17:32:25 compute-0 nova_compute[186479]: 2026-02-17 17:32:25.207 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:25 compute-0 nova_compute[186479]: 2026-02-17 17:32:25.942 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:25 compute-0 nova_compute[186479]: 2026-02-17 17:32:25.968 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:28 compute-0 nova_compute[186479]: 2026-02-17 17:32:28.574 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:28 compute-0 podman[217375]: 2026-02-17 17:32:28.702667416 +0000 UTC m=+0.047620270 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:32:29 compute-0 nova_compute[186479]: 2026-02-17 17:32:29.571 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349534.5693872, 9d30324a-c03a-43f6-a245-3ac4f0923693 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:32:29 compute-0 nova_compute[186479]: 2026-02-17 17:32:29.571 186483 INFO nova.compute.manager [-] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] VM Stopped (Lifecycle Event)
Feb 17 17:32:29 compute-0 nova_compute[186479]: 2026-02-17 17:32:29.595 186483 DEBUG nova.compute.manager [None req-5a073eac-ffe9-4ff5-8a7a-85d42ecb74e5 - - - - - -] [instance: 9d30324a-c03a-43f6-a245-3ac4f0923693] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:30 compute-0 nova_compute[186479]: 2026-02-17 17:32:30.245 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:33 compute-0 nova_compute[186479]: 2026-02-17 17:32:33.575 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:35 compute-0 nova_compute[186479]: 2026-02-17 17:32:35.187 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349540.1859615, af4e641d-d312-4f39-878b-fd7ddc3984df => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:32:35 compute-0 nova_compute[186479]: 2026-02-17 17:32:35.187 186483 INFO nova.compute.manager [-] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] VM Stopped (Lifecycle Event)
Feb 17 17:32:35 compute-0 nova_compute[186479]: 2026-02-17 17:32:35.209 186483 DEBUG nova.compute.manager [None req-89a1ea07-bf22-4f4b-94dd-ade2ff8e2f3d - - - - - -] [instance: af4e641d-d312-4f39-878b-fd7ddc3984df] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:35 compute-0 nova_compute[186479]: 2026-02-17 17:32:35.247 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.627 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.628 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.645 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.711 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.712 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.720 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.721 186483 INFO nova.compute.claims [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:32:37 compute-0 podman[217400]: 2026-02-17 17:32:37.794157728 +0000 UTC m=+0.134548757 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.857 186483 DEBUG nova.compute.provider_tree [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.874 186483 DEBUG nova.scheduler.client.report [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.898 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.899 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.944 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.944 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.964 186483 INFO nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:32:37 compute-0 nova_compute[186479]: 2026-02-17 17:32:37.979 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.080 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.081 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.082 186483 INFO nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Creating image(s)
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.082 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.083 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.083 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.100 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.168 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.169 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.170 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.181 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.258 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.259 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.330 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.331 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.332 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.349 186483 DEBUG nova.policy [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.387 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.387 186483 DEBUG nova.virt.disk.api [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.388 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.454 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.455 186483 DEBUG nova.virt.disk.api [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.455 186483 DEBUG nova.objects.instance [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.469 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.470 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Ensure instance console log exists: /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.471 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.472 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.472 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.577 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:38 compute-0 nova_compute[186479]: 2026-02-17 17:32:38.978 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Successfully created port: d586c1db-aaaa-45aa-8c55-c3a66e387a6e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.832 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Successfully updated port: d586c1db-aaaa-45aa-8c55-c3a66e387a6e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.853 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.853 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.853 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.932 186483 DEBUG nova.compute.manager [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.933 186483 DEBUG nova.compute.manager [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing instance network info cache due to event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.933 186483 DEBUG oslo_concurrency.lockutils [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:32:39 compute-0 nova_compute[186479]: 2026-02-17 17:32:39.999 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.291 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:40 compute-0 podman[217441]: 2026-02-17 17:32:40.741901469 +0000 UTC m=+0.078038373 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.874 186483 DEBUG nova.network.neutron [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.893 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.893 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Instance network_info: |[{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.894 186483 DEBUG oslo_concurrency.lockutils [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.894 186483 DEBUG nova.network.neutron [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.900 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Start _get_guest_xml network_info=[{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.906 186483 WARNING nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.931 186483 DEBUG nova.virt.libvirt.host [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.931 186483 DEBUG nova.virt.libvirt.host [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.935 186483 DEBUG nova.virt.libvirt.host [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.936 186483 DEBUG nova.virt.libvirt.host [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.937 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.937 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.938 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.939 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.939 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.940 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.940 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.940 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.941 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.941 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.942 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.942 186483 DEBUG nova.virt.hardware [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.949 186483 DEBUG nova.virt.libvirt.vif [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:32:38Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.950 186483 DEBUG nova.network.os_vif_util [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.951 186483 DEBUG nova.network.os_vif_util [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:32:40 compute-0 nova_compute[186479]: 2026-02-17 17:32:40.953 186483 DEBUG nova.objects.instance [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.092 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <uuid>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</uuid>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <name>instance-00000006</name>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:32:40</nova:creationTime>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:32:41 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <system>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="serial">3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="uuid">3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </system>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <os>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </os>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <features>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </features>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:71:0b:7d"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <target dev="tapd586c1db-aa"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log" append="off"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <video>
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </video>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:32:41 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:32:41 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:32:41 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:32:41 compute-0 nova_compute[186479]: </domain>
Feb 17 17:32:41 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.094 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Preparing to wait for external event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.095 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.095 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.095 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.097 186483 DEBUG nova.virt.libvirt.vif [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:32:38Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.097 186483 DEBUG nova.network.os_vif_util [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.098 186483 DEBUG nova.network.os_vif_util [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.099 186483 DEBUG os_vif [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.100 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.100 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.101 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.106 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.106 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd586c1db-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.107 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd586c1db-aa, col_values=(('external_ids', {'iface-id': 'd586c1db-aaaa-45aa-8c55-c3a66e387a6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:0b:7d', 'vm-uuid': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.108 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.1094] manager: (tapd586c1db-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.111 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.114 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.114 186483 INFO os_vif [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa')
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.174 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.174 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.175 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:71:0b:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.175 186483 INFO nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Using config drive
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.461 186483 INFO nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Creating config drive at /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.465 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7oga9254 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.593 186483 DEBUG oslo_concurrency.processutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp7oga9254" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:32:41 compute-0 kernel: tapd586c1db-aa: entered promiscuous mode
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.6509] manager: (tapd586c1db-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Feb 17 17:32:41 compute-0 ovn_controller[96568]: 2026-02-17T17:32:41Z|00091|binding|INFO|Claiming lport d586c1db-aaaa-45aa-8c55-c3a66e387a6e for this chassis.
Feb 17 17:32:41 compute-0 ovn_controller[96568]: 2026-02-17T17:32:41Z|00092|binding|INFO|d586c1db-aaaa-45aa-8c55-c3a66e387a6e: Claiming fa:16:3e:71:0b:7d 10.100.0.10
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.651 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.658 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.664 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.674 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:0b:7d 10.100.0.10'], port_security=['fa:16:3e:71:0b:7d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c2aaec3-a759-4c1d-aa84-8bc6ccd10feb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1d0dffc-0ac5-44da-8066-c0d43978c220, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=d586c1db-aaaa-45aa-8c55-c3a66e387a6e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.675 105898 INFO neutron.agent.ovn.metadata.agent [-] Port d586c1db-aaaa-45aa-8c55-c3a66e387a6e in datapath 4bf50377-716d-42af-ab2c-e962c79e0a2f bound to our chassis
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.677 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4bf50377-716d-42af-ab2c-e962c79e0a2f
Feb 17 17:32:41 compute-0 ovn_controller[96568]: 2026-02-17T17:32:41Z|00093|binding|INFO|Setting lport d586c1db-aaaa-45aa-8c55-c3a66e387a6e ovn-installed in OVS
Feb 17 17:32:41 compute-0 ovn_controller[96568]: 2026-02-17T17:32:41Z|00094|binding|INFO|Setting lport d586c1db-aaaa-45aa-8c55-c3a66e387a6e up in Southbound
Feb 17 17:32:41 compute-0 systemd-machined[155877]: New machine qemu-6-instance-00000006.
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.686 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.691 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[75be97e0-e098-4fe1-b641-25c3687bb98d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.692 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4bf50377-71 in ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.694 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4bf50377-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.694 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d67a17-4a81-457c-84e9-6f1dfa3e7c3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.695 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ceeef63c-c9e2-4072-a9b0-c9bd69c95ebc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.706 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[2e532a7e-56e7-438d-96cb-2f13a9e250ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 systemd-udevd[217488]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.717 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[246341d5-6bde-4b28-88ca-34c0bc0bcd01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.7295] device (tapd586c1db-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.7310] device (tapd586c1db-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.739 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[02d27edf-0081-47d5-90f8-6d71fbead79a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 systemd-udevd[217493]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.745 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7c5af1-a605-487d-b058-29800efb54c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.7468] manager: (tap4bf50377-70): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.773 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[76d6b39a-f572-4bef-9562-3304b1378c76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.776 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea45791-58b8-4a76-88be-1d18ae1b8424]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.8003] device (tap4bf50377-70): carrier: link connected
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.804 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[32d56772-14bb-4b0e-ab81-2eb5a8741295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.820 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fbe5be-2b5e-4ec4-a438-804232a4fbba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf50377-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:03:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331711, 'reachable_time': 25129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217519, 'error': None, 'target': 'ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.831 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[38304fd2-b8fb-4848-b2a7-1418d3310307]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:33e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331711, 'tstamp': 331711}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217520, 'error': None, 'target': 'ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.847 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb33617-dd24-49d6-ac48-e6081bb05de6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4bf50377-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:03:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331711, 'reachable_time': 25129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217521, 'error': None, 'target': 'ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.872 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f8794e4b-f8ae-42e9-a087-fe4a017933bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.924 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[eba55511-2cf5-4441-8161-0d6e1eae7236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.926 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf50377-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.926 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.927 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bf50377-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.928 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 NetworkManager[56323]: <info>  [1771349561.9296] manager: (tap4bf50377-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Feb 17 17:32:41 compute-0 kernel: tap4bf50377-70: entered promiscuous mode
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.931 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.932 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4bf50377-70, col_values=(('external_ids', {'iface-id': '4efe1491-cdee-41bd-a179-ae1b240d0928'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.933 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 ovn_controller[96568]: 2026-02-17T17:32:41Z|00095|binding|INFO|Releasing lport 4efe1491-cdee-41bd-a179-ae1b240d0928 from this chassis (sb_readonly=0)
Feb 17 17:32:41 compute-0 nova_compute[186479]: 2026-02-17 17:32:41.941 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.942 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4bf50377-716d-42af-ab2c-e962c79e0a2f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4bf50377-716d-42af-ab2c-e962c79e0a2f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.943 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0949b1-5463-4e2a-b40b-7c09b413d777]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.944 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-4bf50377-716d-42af-ab2c-e962c79e0a2f
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/4bf50377-716d-42af-ab2c-e962c79e0a2f.pid.haproxy
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 4bf50377-716d-42af-ab2c-e962c79e0a2f
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:32:41 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:41.945 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'env', 'PROCESS_TAG=haproxy-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4bf50377-716d-42af-ab2c-e962c79e0a2f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.011 186483 DEBUG nova.compute.manager [req-58f2d4bd-9105-442f-906a-95f9792c7be7 req-36902922-bce7-4a68-acdb-cb6cfe391d94 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.011 186483 DEBUG oslo_concurrency.lockutils [req-58f2d4bd-9105-442f-906a-95f9792c7be7 req-36902922-bce7-4a68-acdb-cb6cfe391d94 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.017 186483 DEBUG oslo_concurrency.lockutils [req-58f2d4bd-9105-442f-906a-95f9792c7be7 req-36902922-bce7-4a68-acdb-cb6cfe391d94 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.017 186483 DEBUG oslo_concurrency.lockutils [req-58f2d4bd-9105-442f-906a-95f9792c7be7 req-36902922-bce7-4a68-acdb-cb6cfe391d94 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.017 186483 DEBUG nova.compute.manager [req-58f2d4bd-9105-442f-906a-95f9792c7be7 req-36902922-bce7-4a68-acdb-cb6cfe391d94 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Processing event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.049 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349562.0491471, 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.050 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] VM Started (Lifecycle Event)
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.053 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.057 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.060 186483 INFO nova.virt.libvirt.driver [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Instance spawned successfully.
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.061 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.085 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.090 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.090 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.091 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.091 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.091 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.092 186483 DEBUG nova.virt.libvirt.driver [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.096 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.132 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.133 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349562.0493827, 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.133 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] VM Paused (Lifecycle Event)
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.154 186483 DEBUG nova.network.neutron [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updated VIF entry in instance network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.154 186483 DEBUG nova.network.neutron [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.156 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.160 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349562.0558217, 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.160 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] VM Resumed (Lifecycle Event)
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.163 186483 INFO nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Took 4.08 seconds to spawn the instance on the hypervisor.
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.163 186483 DEBUG nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.179 186483 DEBUG oslo_concurrency.lockutils [req-3f88953e-74b8-4d5d-81fc-446e3c3851e9 req-d1d572ea-04ad-4fa7-9099-c2937b4330b8 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.193 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.196 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.217 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.227 186483 INFO nova.compute.manager [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Took 4.55 seconds to build instance.
Feb 17 17:32:42 compute-0 nova_compute[186479]: 2026-02-17 17:32:42.243 186483 DEBUG oslo_concurrency.lockutils [None req-49b02aa3-05bc-4ab2-9b50-28bc613c9af6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:42 compute-0 podman[217558]: 2026-02-17 17:32:42.351625187 +0000 UTC m=+0.092376433 container create d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 17 17:32:42 compute-0 podman[217558]: 2026-02-17 17:32:42.284836211 +0000 UTC m=+0.025587517 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:32:42 compute-0 systemd[1]: Started libpod-conmon-d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055.scope.
Feb 17 17:32:42 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:32:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79767c4502347ad9025f7b377435d40f49f2296dd68c485cc8368fd78cb9d354/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:32:42 compute-0 podman[217558]: 2026-02-17 17:32:42.478200157 +0000 UTC m=+0.218951413 container init d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 17 17:32:42 compute-0 podman[217558]: 2026-02-17 17:32:42.485527296 +0000 UTC m=+0.226278522 container start d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 17 17:32:42 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [NOTICE]   (217577) : New worker (217579) forked
Feb 17 17:32:42 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [NOTICE]   (217577) : Loading success.
Feb 17 17:32:43 compute-0 nova_compute[186479]: 2026-02-17 17:32:43.579 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.717 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'name': 'tempest-TestNetworkBasicOps-server-2104168838', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'user_id': '3f041abe92134380b8de39091bce5989', 'hostId': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.720 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a / tapd586c1db-aa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.720 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8df6e347-087b-4b08-9855-20954421ccbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.718095', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aab0f840-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': 'f4dc995bf5b71c8e8352913f61a5a13e5a82c9c4d011a042bb0ee73cbb933531'}]}, 'timestamp': '2026-02-17 17:32:43.721353', '_unique_id': 'dee08a8daab042399311d4af03c4a2e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.722 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.723 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.723 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '999521f1-1be9-48b7-87f9-e1a8ceb26d66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.723223', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aab15024-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '226e7371f08e63781ac302f25efc6c418f4028bcb65032b3f5fba0d4ee9dd11d'}]}, 'timestamp': '2026-02-17 17:32:43.723553', '_unique_id': '9ce7d9afb1774f5c909c2a5d9744d2cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.724 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>]
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.725 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.753 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/cpu volume: 1600000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b2fed7b-7672-477a-b5cb-d618afdec436', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1600000000, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'timestamp': '2026-02-17T17:32:43.725127', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'aab5f12e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.11803206, 'message_signature': '04800b56dcb10c559a3fb9d6b8f93b377c5725578ba1d1206010ef6a2bd06724'}]}, 'timestamp': '2026-02-17 17:32:43.754152', '_unique_id': '548e53300a514a1cb92e2e1a0acefd3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.755 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.757 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.757 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5eb5c530-889d-4219-899d-2fb6c794d825', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.757554', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aab692e6-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '2825129765ee2b11dc70bf0fea5df362dea9a8dba578342755a949f811a494e0'}]}, 'timestamp': '2026-02-17 17:32:43.758165', '_unique_id': '7560e91f7c044431946fe56de3c5e77e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.790 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.latency volume: 454227989 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.790 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.latency volume: 23715400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90effd66-f7ce-4783-99f2-f26a23e1c0b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 454227989, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.760828', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aabb937c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': 'a1cc1fef7ada763ce64e0665ad96ead90975ab1daa9804174e2c4eecbbc71213'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23715400, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.760828', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aabba7ae-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '64d62331603fadf2b37d22f411c011f88d989d81a84fc9a35d735f45e5f3b5f9'}]}, 'timestamp': '2026-02-17 17:32:43.791355', '_unique_id': '0626823b3d254fa3806281fa41d6b075'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.792 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.793 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.793 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44ab6b5b-c122-4f25-9dc6-46ec14120874', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.793950', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aabc1e3c-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': 'b7403a5f16bec10fa0c5281337f18b94ca76af20e9289508cf79ca4338ff0350'}]}, 'timestamp': '2026-02-17 17:32:43.794389', '_unique_id': '0bb337b15eaa4615b0840ad2dce5ae75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.795 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.796 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a5dc1b9-8fd7-4a78-bbe5-e6716c00ac25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.796512', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aabc8016-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '487ca487d80adaf1de776ff2a97798444fe927f6d56996b6aadd14b3ec534464'}]}, 'timestamp': '2026-02-17 17:32:43.796857', '_unique_id': 'eefedde8e799493a9a9157c27ef783a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.797 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.798 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.798 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>]
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.799 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.799 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>]
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.799 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.811 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.812 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb419da3-b43f-4275-bbd9-b0a4c0760e23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.799724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aabecc36-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': 'acd10afc264dfce3a9eb14385cdb2205f425d2727b3e94414d520043e9626e5e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.799724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aabee2fc-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': '81ce72da687e611c945fb5e1d9d94b6a70b6dd85cb4fe86d33689e00d46ab88a'}]}, 'timestamp': '2026-02-17 17:32:43.812675', '_unique_id': 'b906cc5c8e0c447a90400eff4c131fff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.814 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.815 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.815 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.816 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00d4a98b-7d62-490b-b219-7e9bd3305b86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.815695', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aabf71cc-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': '567dc3fb0775993e41a236f3c7ed82a779fd741bd4729e8a0a7f858d10e2b427'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.815695', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aabf84f0-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': 'e68596bda561f72363021888846f12781789e36f3bd67eff03d589340626df94'}]}, 'timestamp': '2026-02-17 17:32:43.816709', '_unique_id': 'c5d203bed3024e77a4a859e93504185f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.817 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.818 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.819 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.819 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c54ccb0a-6b4c-41a4-9e4c-f2285cad1819', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.819146', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aabffa34-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': 'a6ff2df421cc62906f22ee0dda8a4b8c522468666a2712177442aeaa82e889e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.819146', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac00d62-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '3381cad35ca257cf4737e7acd9aff86b78bd9a4521de0390bb37a2a726822548'}]}, 'timestamp': '2026-02-17 17:32:43.820239', '_unique_id': '1d2347cfdf4845b19c52008c18e5f550'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.821 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.822 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.822 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f494e3f3-90c5-43b4-90c7-ea4b361b4b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.822543', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aac08030-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '3d5e8a8409d6b86fe934d1f7ac547076950df2e53d5b34d5b4014b65b4142b6b'}]}, 'timestamp': '2026-02-17 17:32:43.823217', '_unique_id': 'c7a43c5ef1544337891dff712eeb1ba4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.823 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.824 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.825 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d2f27f2-9ad2-4bd9-aa92-85acbd5deaa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.824711', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aac0cf18-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': 'f7e7881c664b885b13cdd6a7380b405ea308bd07e827741cc59b6c1ce0e63a65'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.824711', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac0dd78-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '63b0245a17f20844ae6b5b4208b5b6606ee85ba7ae5b6c980ca582a88a1f117f'}]}, 'timestamp': '2026-02-17 17:32:43.825457', '_unique_id': 'be5b5ca2101345948fa0f5c618376823'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.826 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.827 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25002c5a-cb18-4a3e-a685-2bb4e0878f0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.826991', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aac129fe-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '42edfd615a2e1af5110ee91db1c706c356d35007e31d857df4bea4f3dfb0d9c4'}]}, 'timestamp': '2026-02-17 17:32:43.827439', '_unique_id': '5aa19a6b8b024c9881dc3e13b2546afc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.829 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.829 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14ce1e30-8eda-4e22-9f16-6d792b52632f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.828964', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aac17634-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '139d95b3896f0396f1bb61022698f50319963e4b2b79909c17c8d6f4085a01fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.828964', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac18372-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '7aafc0cb3273ba319c8a9ad66f23cc0b7509db15254bf11f40e07d4373cfaec8'}]}, 'timestamp': '2026-02-17 17:32:43.829691', '_unique_id': '4b77f5a2f286469e8b5048e9b7524866'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.831 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.831 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f02c5bf-d109-49c1-b03e-49b0c2b01621', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.831269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aac1cf12-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': 'cf1686307eae63beee8df8168963111f67181ca6daed9b63b71ed5669c99b4a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.831269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac1dc1e-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '62f0f2c36a751be7feeb05b44f58e8688fd00e9139aff743b400208bcf213750'}]}, 'timestamp': '2026-02-17 17:32:43.831959', '_unique_id': 'ea72886e762f457ea5979e5b7263ad4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.833 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.833 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.833 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2104168838>]
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.834 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.834 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3c8cd82-bb37-4e87-8e5d-937f6f3015cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.834167', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aac24050-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': 'b725f22d7fd4b309cad6926ccc627da0bd7f922e2f9494344983a3cc19df4b32'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.834167', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac24c8a-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.164776235, 'message_signature': 'a31e6d5285b322264e5961aaa7486b3d5a6416271d75d5892860b36a7fadb035'}]}, 'timestamp': '2026-02-17 17:32:43.834868', '_unique_id': '8ed20f273f2746d388fa0a770e8ca942'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.835 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.836 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e88f2db6-a706-4b5e-a844-b561619f8282', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.836424', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aac296c2-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': '91bbf4a93a854754cddbd02fedb6ef69940b65584c00b6da6d51a6da19bde172'}]}, 'timestamp': '2026-02-17 17:32:43.836749', '_unique_id': 'f981a05ac89e43a88f2e0fda877b2f82'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.838 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.838 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ddf072b-496c-47b9-aec0-c57ad0028d95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-vda', 'timestamp': '2026-02-17T17:32:43.838422', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aac2e528-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': '0014d1af05bd74a9eb7642b7d2e45cc77a7b2b98b9bbf8d36295837c3f9a0e2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-sda', 'timestamp': '2026-02-17T17:32:43.838422', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'instance-00000006', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aac2f0fe-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.125870102, 'message_signature': 'e0f6df497ec8279ba896c44896dabe98be03565fdb7f21fcd3e79609a55204c5'}]}, 'timestamp': '2026-02-17 17:32:43.839039', '_unique_id': 'cdea84cba0074abcbd67fed11e1d80e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.840 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.840 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.840 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a: ceilometer.compute.pollsters.NoVolumeException
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.840 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '485958cb-5b4c-4d08-9081-af6921a236b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.840912', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aac346c6-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': 'fc290349c3e04259c4d10b54e31f41ef727aa55f424bc72d58b2ee2483c97e35'}]}, 'timestamp': '2026-02-17 17:32:43.841256', '_unique_id': '187d6772f2af4a0bb82e7e413451d3f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.842 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.842 12 DEBUG ceilometer.compute.pollsters [-] 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01d9972b-6689-483a-ac7b-2f1061eb7a2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-00000006-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-tapd586c1db-aa', 'timestamp': '2026-02-17T17:32:43.842672', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2104168838', 'name': 'tapd586c1db-aa', 'instance_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:71:0b:7d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd586c1db-aa'}, 'message_id': 'aac38ba4-0c26-11f1-ab0d-fa163e76883c', 'monotonic_time': 3319.083084194, 'message_signature': 'aa9107b1b583924c19cbde8a5ea509ebbcd1a44e6d75eef99ecb4c485b2425a5'}]}, 'timestamp': '2026-02-17 17:32:43.843016', '_unique_id': '2c9ffa61d93a4447808010c5eec771c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:32:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:32:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.112 186483 DEBUG nova.compute.manager [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.112 186483 DEBUG oslo_concurrency.lockutils [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.113 186483 DEBUG oslo_concurrency.lockutils [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.113 186483 DEBUG oslo_concurrency.lockutils [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.113 186483 DEBUG nova.compute.manager [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:32:44 compute-0 nova_compute[186479]: 2026-02-17 17:32:44.113 186483 WARNING nova.compute.manager [req-45d0ae78-6bb3-46a8-8bde-7579720af902 req-1c1e401a-43d8-40dd-af99-3728f2b23945 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e for instance with vm_state active and task_state None.
Feb 17 17:32:45 compute-0 nova_compute[186479]: 2026-02-17 17:32:45.363 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:45.363 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:32:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:45.365 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.109 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.331 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:46 compute-0 NetworkManager[56323]: <info>  [1771349566.3321] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Feb 17 17:32:46 compute-0 NetworkManager[56323]: <info>  [1771349566.3334] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 17 17:32:46 compute-0 ovn_controller[96568]: 2026-02-17T17:32:46Z|00096|binding|INFO|Releasing lport 4efe1491-cdee-41bd-a179-ae1b240d0928 from this chassis (sb_readonly=0)
Feb 17 17:32:46 compute-0 ovn_controller[96568]: 2026-02-17T17:32:46Z|00097|binding|INFO|Releasing lport 4efe1491-cdee-41bd-a179-ae1b240d0928 from this chassis (sb_readonly=0)
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.339 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.344 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.579 186483 DEBUG nova.compute.manager [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.580 186483 DEBUG nova.compute.manager [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing instance network info cache due to event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.580 186483 DEBUG oslo_concurrency.lockutils [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.580 186483 DEBUG oslo_concurrency.lockutils [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:32:46 compute-0 nova_compute[186479]: 2026-02-17 17:32:46.581 186483 DEBUG nova.network.neutron [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:32:46 compute-0 podman[217589]: 2026-02-17 17:32:46.736272143 +0000 UTC m=+0.068728244 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 17 17:32:47 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:32:47.367 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:32:47 compute-0 nova_compute[186479]: 2026-02-17 17:32:47.814 186483 DEBUG nova.network.neutron [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updated VIF entry in instance network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:32:47 compute-0 nova_compute[186479]: 2026-02-17 17:32:47.816 186483 DEBUG nova.network.neutron [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:32:47 compute-0 nova_compute[186479]: 2026-02-17 17:32:47.836 186483 DEBUG oslo_concurrency.lockutils [req-5fab0046-8a8b-46de-a1c5-3c434ac26b7c req-7068c859-adf7-4645-8610-fc5cc8480806 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:32:48 compute-0 nova_compute[186479]: 2026-02-17 17:32:48.580 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:51 compute-0 nova_compute[186479]: 2026-02-17 17:32:51.113 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:53 compute-0 nova_compute[186479]: 2026-02-17 17:32:53.608 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:54 compute-0 podman[217632]: 2026-02-17 17:32:54.71527889 +0000 UTC m=+0.056001013 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 17 17:32:54 compute-0 podman[217631]: 2026-02-17 17:32:54.715856093 +0000 UTC m=+0.056664728 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 17 17:32:54 compute-0 ovn_controller[96568]: 2026-02-17T17:32:54Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:0b:7d 10.100.0.10
Feb 17 17:32:54 compute-0 ovn_controller[96568]: 2026-02-17T17:32:54Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:0b:7d 10.100.0.10
Feb 17 17:32:56 compute-0 nova_compute[186479]: 2026-02-17 17:32:56.115 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:58 compute-0 nova_compute[186479]: 2026-02-17 17:32:58.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:58 compute-0 nova_compute[186479]: 2026-02-17 17:32:58.611 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:32:59 compute-0 nova_compute[186479]: 2026-02-17 17:32:59.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:32:59 compute-0 podman[217670]: 2026-02-17 17:32:59.712470192 +0000 UTC m=+0.049117164 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.987 186483 INFO nova.compute.manager [None req-7ea42e3c-c586-42a4-ae51-ee6a131bb99e 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Get console output
Feb 17 17:33:00 compute-0 nova_compute[186479]: 2026-02-17 17:33:00.993 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.118 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.334 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.335 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.336 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.336 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.405 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.476 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.479 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.545 186483 DEBUG oslo_concurrency.processutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.719 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.720 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5588MB free_disk=73.17782592773438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.720 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.720 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.793 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.794 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.794 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.831 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.843 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.861 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:33:01 compute-0 nova_compute[186479]: 2026-02-17 17:33:01.861 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:03 compute-0 nova_compute[186479]: 2026-02-17 17:33:03.625 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.007 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.008 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.008 186483 DEBUG nova.objects.instance [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'flavor' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.558 186483 DEBUG nova.objects.instance [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_requests' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.575 186483 DEBUG nova.network.neutron [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:33:04 compute-0 nova_compute[186479]: 2026-02-17 17:33:04.739 186483 DEBUG nova.policy [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.334 186483 DEBUG nova.network.neutron [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Successfully created port: 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.863 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.863 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.908 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.912 186483 DEBUG nova.network.neutron [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Successfully updated port: 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.929 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.929 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:05 compute-0 nova_compute[186479]: 2026-02-17 17:33:05.930 186483 DEBUG nova.network.neutron [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:33:06 compute-0 nova_compute[186479]: 2026-02-17 17:33:06.005 186483 DEBUG nova.compute.manager [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-changed-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:06 compute-0 nova_compute[186479]: 2026-02-17 17:33:06.005 186483 DEBUG nova.compute.manager [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing instance network info cache due to event network-changed-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:33:06 compute-0 nova_compute[186479]: 2026-02-17 17:33:06.006 186483 DEBUG oslo_concurrency.lockutils [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:06 compute-0 nova_compute[186479]: 2026-02-17 17:33:06.122 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.223 186483 DEBUG nova.network.neutron [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.248 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.248 186483 DEBUG oslo_concurrency.lockutils [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.249 186483 DEBUG nova.network.neutron [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing network info cache for port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.252 186483 DEBUG nova.virt.libvirt.vif [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.252 186483 DEBUG nova.network.os_vif_util [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.253 186483 DEBUG nova.network.os_vif_util [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.253 186483 DEBUG os_vif [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.253 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.254 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.254 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.256 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.256 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6440ba6f-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.257 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6440ba6f-03, col_values=(('external_ids', {'iface-id': '6440ba6f-03ca-4cca-9d42-4f3232f1cbaf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:86:a4', 'vm-uuid': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.258 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.2599] manager: (tap6440ba6f-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.259 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.267 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.268 186483 INFO os_vif [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03')
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.269 186483 DEBUG nova.virt.libvirt.vif [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.269 186483 DEBUG nova.network.os_vif_util [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.270 186483 DEBUG nova.network.os_vif_util [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.272 186483 DEBUG nova.virt.libvirt.guest [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] attach device xml: <interface type="ethernet">
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:c1:86:a4"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <target dev="tap6440ba6f-03"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]: </interface>
Feb 17 17:33:07 compute-0 nova_compute[186479]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Feb 17 17:33:07 compute-0 kernel: tap6440ba6f-03: entered promiscuous mode
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.2840] manager: (tap6440ba6f-03): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Feb 17 17:33:07 compute-0 ovn_controller[96568]: 2026-02-17T17:33:07Z|00098|binding|INFO|Claiming lport 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf for this chassis.
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.286 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_controller[96568]: 2026-02-17T17:33:07Z|00099|binding|INFO|6440ba6f-03ca-4cca-9d42-4f3232f1cbaf: Claiming fa:16:3e:c1:86:a4 10.100.0.28
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.290 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.294 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:86:a4 10.100.0.28'], port_security=['fa:16:3e:c1:86:a4 10.100.0.28'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0614ce48-1760-453d-9a6d-b76cf85eeab3, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.296 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf in datapath 0a9ae613-f3e7-4402-af22-4077e6ce992d bound to our chassis
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.297 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a9ae613-f3e7-4402-af22-4077e6ce992d
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.298 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_controller[96568]: 2026-02-17T17:33:07Z|00100|binding|INFO|Setting lport 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf ovn-installed in OVS
Feb 17 17:33:07 compute-0 ovn_controller[96568]: 2026-02-17T17:33:07Z|00101|binding|INFO|Setting lport 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf up in Southbound
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.300 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.309 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[10cf0dfb-4811-444b-a3af-3e134bec32ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.310 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a9ae613-f1 in ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.312 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a9ae613-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.312 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf6b56b-9d7c-4c2f-a6b0-12d7aff4c11f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.314 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[915e99db-7ad1-47e9-a706-e8c72444d366]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 systemd-udevd[217709]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.3273] device (tap6440ba6f-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.3280] device (tap6440ba6f-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.327 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[6d28a0ce-689e-450b-8e8f-cb4e77dbc4ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.348 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[41eea5a5-aa7f-4afc-8198-4b59b4c0f65f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.369 186483 DEBUG nova.virt.libvirt.driver [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.370 186483 DEBUG nova.virt.libvirt.driver [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.370 186483 DEBUG nova.virt.libvirt.driver [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:71:0b:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.371 186483 DEBUG nova.virt.libvirt.driver [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:c1:86:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.374 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[d91b8b42-e90a-456b-a1f3-527c9083ce31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.378 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7958c8-ca3c-43d1-ab28-c2673f8c789d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.3790] manager: (tap0a9ae613-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Feb 17 17:33:07 compute-0 systemd-udevd[217712]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.395 186483 DEBUG nova.virt.libvirt.guest [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:07</nova:creationTime>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:07 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     <nova:port uuid="6440ba6f-03ca-4cca-9d42-4f3232f1cbaf">
Feb 17 17:33:07 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Feb 17 17:33:07 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:07 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:07 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:07 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.401 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[8e98cedc-7757-44b4-b5e9-77b9632ede0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.403 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[65a36edf-9d0b-4cb1-ac1b-cb7bf26e4349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.417 186483 DEBUG oslo_concurrency.lockutils [None req-20f9afcb-c571-4f4d-a896-8c80a64300a8 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 3.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.4232] device (tap0a9ae613-f0): carrier: link connected
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.429 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[113a0906-7260-4871-8652-d13f3e5337e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.446 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[65160a39-896f-4c84-b42f-0ba14c09fcc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9ae613-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:6f:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334273, 'reachable_time': 36640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217736, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.458 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[22927387-0f19-4f92-8e62-94a14b12f4ab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe38:6fb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334273, 'tstamp': 334273}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217737, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.474 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[df3bf7c8-3dcc-41e5-8435-158e9d7d15c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9ae613-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:6f:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334273, 'reachable_time': 36640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217738, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.505 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bf3573-cb6d-4dd2-bc5b-4933033528de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.561 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1de3c9-767f-4737-95c9-3109d3096ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.563 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9ae613-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.563 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.563 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a9ae613-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.565 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 NetworkManager[56323]: <info>  [1771349587.5660] manager: (tap0a9ae613-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 17 17:33:07 compute-0 kernel: tap0a9ae613-f0: entered promiscuous mode
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.570 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a9ae613-f0, col_values=(('external_ids', {'iface-id': 'c51c13dd-c5ca-4593-a4ef-aa6abe6be705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.571 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_controller[96568]: 2026-02-17T17:33:07Z|00102|binding|INFO|Releasing lport c51c13dd-c5ca-4593-a4ef-aa6abe6be705 from this chassis (sb_readonly=0)
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.572 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.573 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a9ae613-f3e7-4402-af22-4077e6ce992d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a9ae613-f3e7-4402-af22-4077e6ce992d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.574 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[16e84275-0a8a-4827-ad09-8e2c3a110d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.574 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-0a9ae613-f3e7-4402-af22-4077e6ce992d
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/0a9ae613-f3e7-4402-af22-4077e6ce992d.pid.haproxy
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 0a9ae613-f3e7-4402-af22-4077e6ce992d
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:33:07 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:07.575 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'env', 'PROCESS_TAG=haproxy-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a9ae613-f3e7-4402-af22-4077e6ce992d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:33:07 compute-0 nova_compute[186479]: 2026-02-17 17:33:07.576 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:07 compute-0 podman[217770]: 2026-02-17 17:33:07.913958116 +0000 UTC m=+0.048444838 container create 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 17 17:33:07 compute-0 systemd[1]: Started libpod-conmon-703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933.scope.
Feb 17 17:33:07 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:33:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/310c89dc0e85e6a75d21a9dbed2cc39edab91408fb2ac88656c51957472cd789/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:33:07 compute-0 podman[217770]: 2026-02-17 17:33:07.888463022 +0000 UTC m=+0.022949784 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:33:07 compute-0 podman[217770]: 2026-02-17 17:33:07.993416842 +0000 UTC m=+0.127903574 container init 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 17 17:33:08 compute-0 podman[217770]: 2026-02-17 17:33:08.000040064 +0000 UTC m=+0.134526776 container start 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:33:08 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [NOTICE]   (217810) : New worker (217815) forked
Feb 17 17:33:08 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [NOTICE]   (217810) : Loading success.
Feb 17 17:33:08 compute-0 podman[217784]: 2026-02-17 17:33:08.037219585 +0000 UTC m=+0.084548173 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:33:08 compute-0 nova_compute[186479]: 2026-02-17 17:33:08.673 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:09 compute-0 ovn_controller[96568]: 2026-02-17T17:33:09Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:86:a4 10.100.0.28
Feb 17 17:33:09 compute-0 ovn_controller[96568]: 2026-02-17T17:33:09Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:86:a4 10.100.0.28
Feb 17 17:33:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:10.952 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.702 186483 DEBUG nova.compute.manager [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.702 186483 DEBUG oslo_concurrency.lockutils [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.702 186483 DEBUG oslo_concurrency.lockutils [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.703 186483 DEBUG oslo_concurrency.lockutils [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.703 186483 DEBUG nova.compute.manager [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:11 compute-0 nova_compute[186479]: 2026-02-17 17:33:11.703 186483 WARNING nova.compute.manager [req-6f17b7c7-7b83-4b17-b1e6-d9642bddd72e req-0e4c9a77-6506-4d2a-a003-3741f4832aef 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf for instance with vm_state active and task_state None.
Feb 17 17:33:11 compute-0 podman[217827]: 2026-02-17 17:33:11.751607344 +0000 UTC m=+0.069545435 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:33:12 compute-0 nova_compute[186479]: 2026-02-17 17:33:12.226 186483 DEBUG nova.network.neutron [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updated VIF entry in instance network info cache for port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:33:12 compute-0 nova_compute[186479]: 2026-02-17 17:33:12.227 186483 DEBUG nova.network.neutron [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:12 compute-0 nova_compute[186479]: 2026-02-17 17:33:12.243 186483 DEBUG oslo_concurrency.lockutils [req-bd3a0780-5086-4f96-a363-3580bf49e375 req-ea730cd4-a937-4b91-89d8-8eb1235c9901 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:12 compute-0 nova_compute[186479]: 2026-02-17 17:33:12.258 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.676 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.793 186483 DEBUG nova.compute.manager [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.793 186483 DEBUG oslo_concurrency.lockutils [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.794 186483 DEBUG oslo_concurrency.lockutils [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.794 186483 DEBUG oslo_concurrency.lockutils [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.794 186483 DEBUG nova.compute.manager [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:13 compute-0 nova_compute[186479]: 2026-02-17 17:33:13.794 186483 WARNING nova.compute.manager [req-ea3284d3-857d-4041-a237-62bb427e238c req-bb4c026f-082b-420e-b40b-5194f1a4ccc9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf for instance with vm_state active and task_state None.
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.261 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.597 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.597 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.615 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:33:17 compute-0 podman[217851]: 2026-02-17 17:33:17.716021265 +0000 UTC m=+0.056546257 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.735 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.736 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.744 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.744 186483 INFO nova.compute.claims [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.891 186483 DEBUG nova.compute.provider_tree [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.904 186483 DEBUG nova.scheduler.client.report [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.923 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.924 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.963 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.963 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.978 186483 INFO nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:33:17 compute-0 nova_compute[186479]: 2026-02-17 17:33:17.998 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.096 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.097 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.097 186483 INFO nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Creating image(s)
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.098 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.098 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.099 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.114 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.134 186483 DEBUG nova.policy [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.185 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.186 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.187 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.216 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.268 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.270 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.306 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.307 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.308 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.357 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.358 186483 DEBUG nova.virt.disk.api [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.358 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.410 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.411 186483 DEBUG nova.virt.disk.api [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.412 186483 DEBUG nova.objects.instance [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 8b2c187c-d396-49a6-bc6b-a784ab66a468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.430 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.430 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Ensure instance console log exists: /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.431 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.431 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.431 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:18 compute-0 nova_compute[186479]: 2026-02-17 17:33:18.703 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:20 compute-0 nova_compute[186479]: 2026-02-17 17:33:20.508 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Successfully created port: 4a5c580c-8f81-428a-9cfb-05b0ce894f36 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:33:21 compute-0 sshd-session[217887]: Connection closed by authenticating user root 209.38.233.161 port 37930 [preauth]
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.301 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.585 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Successfully updated port: 4a5c580c-8f81-428a-9cfb-05b0ce894f36 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.602 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.602 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.603 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.703 186483 DEBUG nova.compute.manager [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-changed-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.704 186483 DEBUG nova.compute.manager [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Refreshing instance network info cache due to event network-changed-4a5c580c-8f81-428a-9cfb-05b0ce894f36. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:33:22 compute-0 nova_compute[186479]: 2026-02-17 17:33:22.705 186483 DEBUG oslo_concurrency.lockutils [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:23 compute-0 nova_compute[186479]: 2026-02-17 17:33:23.348 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:33:23 compute-0 nova_compute[186479]: 2026-02-17 17:33:23.750 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.390 186483 DEBUG nova.network.neutron [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Updating instance_info_cache with network_info: [{"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.419 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.420 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Instance network_info: |[{"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.421 186483 DEBUG oslo_concurrency.lockutils [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.421 186483 DEBUG nova.network.neutron [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Refreshing network info cache for port 4a5c580c-8f81-428a-9cfb-05b0ce894f36 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.424 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Start _get_guest_xml network_info=[{"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.431 186483 WARNING nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.437 186483 DEBUG nova.virt.libvirt.host [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.438 186483 DEBUG nova.virt.libvirt.host [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.442 186483 DEBUG nova.virt.libvirt.host [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.442 186483 DEBUG nova.virt.libvirt.host [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.443 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.443 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.444 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.444 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.444 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.445 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.445 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.445 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.445 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.446 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.446 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.446 186483 DEBUG nova.virt.hardware [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.451 186483 DEBUG nova.virt.libvirt.vif [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:33:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-332137290',display_name='tempest-TestNetworkBasicOps-server-332137290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-332137290',id=7,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqubP3j3ONueIiHFyzXtRM0URmPjGw9t6js/e76shcAk3x1UJBOXDWolRvWenawlzqoUbZHdPsKvxLR8GE4untTlcIllm0bgOiuTQPR4oxuyVDkaiZtq4Ltt50HEZzWQQ==',key_name='tempest-TestNetworkBasicOps-43454294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-0x5t0m19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:33:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8b2c187c-d396-49a6-bc6b-a784ab66a468,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.451 186483 DEBUG nova.network.os_vif_util [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.452 186483 DEBUG nova.network.os_vif_util [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.453 186483 DEBUG nova.objects.instance [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b2c187c-d396-49a6-bc6b-a784ab66a468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.501 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <uuid>8b2c187c-d396-49a6-bc6b-a784ab66a468</uuid>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <name>instance-00000007</name>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-332137290</nova:name>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:33:24</nova:creationTime>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         <nova:port uuid="4a5c580c-8f81-428a-9cfb-05b0ce894f36">
Feb 17 17:33:24 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <system>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="serial">8b2c187c-d396-49a6-bc6b-a784ab66a468</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="uuid">8b2c187c-d396-49a6-bc6b-a784ab66a468</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </system>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <os>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </os>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <features>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </features>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.config"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:32:20:0b"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <target dev="tap4a5c580c-8f"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/console.log" append="off"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <video>
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </video>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:33:24 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:33:24 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:33:24 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:33:24 compute-0 nova_compute[186479]: </domain>
Feb 17 17:33:24 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.503 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Preparing to wait for external event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.503 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.503 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.503 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.504 186483 DEBUG nova.virt.libvirt.vif [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:33:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-332137290',display_name='tempest-TestNetworkBasicOps-server-332137290',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-332137290',id=7,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqubP3j3ONueIiHFyzXtRM0URmPjGw9t6js/e76shcAk3x1UJBOXDWolRvWenawlzqoUbZHdPsKvxLR8GE4untTlcIllm0bgOiuTQPR4oxuyVDkaiZtq4Ltt50HEZzWQQ==',key_name='tempest-TestNetworkBasicOps-43454294',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-0x5t0m19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:33:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8b2c187c-d396-49a6-bc6b-a784ab66a468,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.505 186483 DEBUG nova.network.os_vif_util [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.505 186483 DEBUG nova.network.os_vif_util [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.506 186483 DEBUG os_vif [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.507 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.507 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.508 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.513 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.514 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a5c580c-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.515 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a5c580c-8f, col_values=(('external_ids', {'iface-id': '4a5c580c-8f81-428a-9cfb-05b0ce894f36', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:20:0b', 'vm-uuid': '8b2c187c-d396-49a6-bc6b-a784ab66a468'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.517 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:24 compute-0 NetworkManager[56323]: <info>  [1771349604.5181] manager: (tap4a5c580c-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.520 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.522 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.523 186483 INFO os_vif [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f')
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.575 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.576 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.576 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:32:20:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:33:24 compute-0 nova_compute[186479]: 2026-02-17 17:33:24.576 186483 INFO nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Using config drive
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.353 186483 INFO nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Creating config drive at /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.config
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.357 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqs1pbvm0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.486 186483 DEBUG oslo_concurrency.processutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqs1pbvm0" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:33:25 compute-0 kernel: tap4a5c580c-8f: entered promiscuous mode
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.557 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:25 compute-0 ovn_controller[96568]: 2026-02-17T17:33:25Z|00103|binding|INFO|Claiming lport 4a5c580c-8f81-428a-9cfb-05b0ce894f36 for this chassis.
Feb 17 17:33:25 compute-0 ovn_controller[96568]: 2026-02-17T17:33:25Z|00104|binding|INFO|4a5c580c-8f81-428a-9cfb-05b0ce894f36: Claiming fa:16:3e:32:20:0b 10.100.0.27
Feb 17 17:33:25 compute-0 NetworkManager[56323]: <info>  [1771349605.5602] manager: (tap4a5c580c-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Feb 17 17:33:25 compute-0 ovn_controller[96568]: 2026-02-17T17:33:25Z|00105|binding|INFO|Setting lport 4a5c580c-8f81-428a-9cfb-05b0ce894f36 ovn-installed in OVS
Feb 17 17:33:25 compute-0 ovn_controller[96568]: 2026-02-17T17:33:25Z|00106|binding|INFO|Setting lport 4a5c580c-8f81-428a-9cfb-05b0ce894f36 up in Southbound
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.567 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.567 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:20:0b 10.100.0.27'], port_security=['fa:16:3e:32:20:0b 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '8b2c187c-d396-49a6-bc6b-a784ab66a468', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '663bd69a-2dae-4fb8-bff3-f4bda4cd5914', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0614ce48-1760-453d-9a6d-b76cf85eeab3, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=4a5c580c-8f81-428a-9cfb-05b0ce894f36) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.569 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 4a5c580c-8f81-428a-9cfb-05b0ce894f36 in datapath 0a9ae613-f3e7-4402-af22-4077e6ce992d bound to our chassis
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.570 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a9ae613-f3e7-4402-af22-4077e6ce992d
Feb 17 17:33:25 compute-0 systemd-machined[155877]: New machine qemu-7-instance-00000007.
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.588 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[46064332-72e7-4a9d-b787-f0053d63948b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Feb 17 17:33:25 compute-0 podman[217902]: 2026-02-17 17:33:25.617602893 +0000 UTC m=+0.070293362 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.617 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef7a1e4-7b24-48c0-b94c-c8b339de6a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.622 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[e64db21c-77f6-4188-8b92-59bf76b0cc9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 systemd-udevd[217949]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:33:25 compute-0 podman[217901]: 2026-02-17 17:33:25.640726119 +0000 UTC m=+0.095749666 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:33:25 compute-0 NetworkManager[56323]: <info>  [1771349605.6464] device (tap4a5c580c-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:33:25 compute-0 NetworkManager[56323]: <info>  [1771349605.6474] device (tap4a5c580c-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.649 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[74a72ed3-1d17-4c74-9e55-382bb33ac69b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.668 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[55b6fc0a-9159-458b-a426-542c23260f54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9ae613-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:6f:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334273, 'reachable_time': 36640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217958, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.685 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[beb8874a-372c-487b-8c64-dacf6844e307]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a9ae613-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334283, 'tstamp': 334283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217960, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap0a9ae613-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334286, 'tstamp': 334286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217960, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.687 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9ae613-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.690 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.690 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a9ae613-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.690 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.691 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a9ae613-f0, col_values=(('external_ids', {'iface-id': 'c51c13dd-c5ca-4593-a4ef-aa6abe6be705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:25 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:25.691 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.789 186483 DEBUG nova.compute.manager [req-f0795245-58cf-4652-b6e3-cd34e01966d6 req-bedcbd40-f390-48f4-9269-22250ac39f75 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.790 186483 DEBUG oslo_concurrency.lockutils [req-f0795245-58cf-4652-b6e3-cd34e01966d6 req-bedcbd40-f390-48f4-9269-22250ac39f75 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.790 186483 DEBUG oslo_concurrency.lockutils [req-f0795245-58cf-4652-b6e3-cd34e01966d6 req-bedcbd40-f390-48f4-9269-22250ac39f75 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.791 186483 DEBUG oslo_concurrency.lockutils [req-f0795245-58cf-4652-b6e3-cd34e01966d6 req-bedcbd40-f390-48f4-9269-22250ac39f75 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.791 186483 DEBUG nova.compute.manager [req-f0795245-58cf-4652-b6e3-cd34e01966d6 req-bedcbd40-f390-48f4-9269-22250ac39f75 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Processing event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.873 186483 DEBUG nova.network.neutron [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Updated VIF entry in instance network info cache for port 4a5c580c-8f81-428a-9cfb-05b0ce894f36. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.873 186483 DEBUG nova.network.neutron [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Updating instance_info_cache with network_info: [{"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.889 186483 DEBUG oslo_concurrency.lockutils [req-fb71ce47-85e6-41be-b8d2-a8955797a99a req-0a81ab61-91e1-4196-a58a-25d1979ddd08 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-8b2c187c-d396-49a6-bc6b-a784ab66a468" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.951 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349605.951071, 8b2c187c-d396-49a6-bc6b-a784ab66a468 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.952 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] VM Started (Lifecycle Event)
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.955 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.959 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.962 186483 INFO nova.virt.libvirt.driver [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Instance spawned successfully.
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.962 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.970 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.974 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.982 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.983 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.983 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.984 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.984 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.985 186483 DEBUG nova.virt.libvirt.driver [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.993 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.993 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349605.954687, 8b2c187c-d396-49a6-bc6b-a784ab66a468 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:33:25 compute-0 nova_compute[186479]: 2026-02-17 17:33:25.994 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] VM Paused (Lifecycle Event)
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.024 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.028 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349605.9577127, 8b2c187c-d396-49a6-bc6b-a784ab66a468 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.028 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] VM Resumed (Lifecycle Event)
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.040 186483 INFO nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Took 7.94 seconds to spawn the instance on the hypervisor.
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.041 186483 DEBUG nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.045 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.052 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.078 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.105 186483 INFO nova.compute.manager [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Took 8.42 seconds to build instance.
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.124 186483 DEBUG oslo_concurrency.lockutils [None req-7a2ae42c-dc9e-4001-9175-a547df59af67 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:26 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:26.320 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:26 compute-0 nova_compute[186479]: 2026-02-17 17:33:26.321 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:26 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:26.322 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:33:27 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:27.326 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.860 186483 DEBUG nova.compute.manager [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.861 186483 DEBUG oslo_concurrency.lockutils [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.861 186483 DEBUG oslo_concurrency.lockutils [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.862 186483 DEBUG oslo_concurrency.lockutils [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.862 186483 DEBUG nova.compute.manager [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] No waiting events found dispatching network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:27 compute-0 nova_compute[186479]: 2026-02-17 17:33:27.862 186483 WARNING nova.compute.manager [req-e4c5649d-dade-464f-9774-3e0517672968 req-02b9fb35-f115-4d4f-9ea6-dc48c2defb61 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received unexpected event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 for instance with vm_state active and task_state None.
Feb 17 17:33:28 compute-0 nova_compute[186479]: 2026-02-17 17:33:28.754 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:29 compute-0 nova_compute[186479]: 2026-02-17 17:33:29.518 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:30 compute-0 podman[217968]: 2026-02-17 17:33:30.734161566 +0000 UTC m=+0.070506907 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:33:33 compute-0 nova_compute[186479]: 2026-02-17 17:33:33.755 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:34 compute-0 nova_compute[186479]: 2026-02-17 17:33:34.520 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:38 compute-0 nova_compute[186479]: 2026-02-17 17:33:38.757 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:38 compute-0 ovn_controller[96568]: 2026-02-17T17:33:38Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:20:0b 10.100.0.27
Feb 17 17:33:38 compute-0 ovn_controller[96568]: 2026-02-17T17:33:38Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:20:0b 10.100.0.27
Feb 17 17:33:38 compute-0 podman[218009]: 2026-02-17 17:33:38.774900713 +0000 UTC m=+0.110014776 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:33:39 compute-0 nova_compute[186479]: 2026-02-17 17:33:39.521 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:42 compute-0 podman[218036]: 2026-02-17 17:33:42.708315186 +0000 UTC m=+0.049037679 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:33:43 compute-0 nova_compute[186479]: 2026-02-17 17:33:43.760 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:44 compute-0 nova_compute[186479]: 2026-02-17 17:33:44.522 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:46 compute-0 nova_compute[186479]: 2026-02-17 17:33:46.803 186483 DEBUG nova.compute.manager [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-changed-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:46 compute-0 nova_compute[186479]: 2026-02-17 17:33:46.803 186483 DEBUG nova.compute.manager [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing instance network info cache due to event network-changed-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:33:46 compute-0 nova_compute[186479]: 2026-02-17 17:33:46.804 186483 DEBUG oslo_concurrency.lockutils [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:46 compute-0 nova_compute[186479]: 2026-02-17 17:33:46.804 186483 DEBUG oslo_concurrency.lockutils [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:46 compute-0 nova_compute[186479]: 2026-02-17 17:33:46.804 186483 DEBUG nova.network.neutron [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing network info cache for port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:33:48 compute-0 nova_compute[186479]: 2026-02-17 17:33:48.015 186483 DEBUG nova.network.neutron [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updated VIF entry in instance network info cache for port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:33:48 compute-0 nova_compute[186479]: 2026-02-17 17:33:48.016 186483 DEBUG nova.network.neutron [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:48 compute-0 nova_compute[186479]: 2026-02-17 17:33:48.036 186483 DEBUG oslo_concurrency.lockutils [req-011f3b8c-e572-4640-a455-364be0b7f6f9 req-66c70b82-1bdd-431f-a9bd-9a220a5bac44 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:48 compute-0 podman[218060]: 2026-02-17 17:33:48.717977504 +0000 UTC m=+0.055810926 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter)
Feb 17 17:33:48 compute-0 nova_compute[186479]: 2026-02-17 17:33:48.762 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:49 compute-0 nova_compute[186479]: 2026-02-17 17:33:49.536 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.217 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.218 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.218 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.218 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.218 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.219 186483 INFO nova.compute.manager [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Terminating instance
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.220 186483 DEBUG nova.compute.manager [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:33:51 compute-0 kernel: tap4a5c580c-8f (unregistering): left promiscuous mode
Feb 17 17:33:51 compute-0 NetworkManager[56323]: <info>  [1771349631.2504] device (tap4a5c580c-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:33:51 compute-0 ovn_controller[96568]: 2026-02-17T17:33:51Z|00107|binding|INFO|Releasing lport 4a5c580c-8f81-428a-9cfb-05b0ce894f36 from this chassis (sb_readonly=0)
Feb 17 17:33:51 compute-0 ovn_controller[96568]: 2026-02-17T17:33:51Z|00108|binding|INFO|Setting lport 4a5c580c-8f81-428a-9cfb-05b0ce894f36 down in Southbound
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.255 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 ovn_controller[96568]: 2026-02-17T17:33:51Z|00109|binding|INFO|Removing iface tap4a5c580c-8f ovn-installed in OVS
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.257 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.261 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.263 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:20:0b 10.100.0.27'], port_security=['fa:16:3e:32:20:0b 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '8b2c187c-d396-49a6-bc6b-a784ab66a468', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '663bd69a-2dae-4fb8-bff3-f4bda4cd5914', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0614ce48-1760-453d-9a6d-b76cf85eeab3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=4a5c580c-8f81-428a-9cfb-05b0ce894f36) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.264 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 4a5c580c-8f81-428a-9cfb-05b0ce894f36 in datapath 0a9ae613-f3e7-4402-af22-4077e6ce992d unbound from our chassis
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.265 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a9ae613-f3e7-4402-af22-4077e6ce992d
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.274 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b46649-fe01-4d5e-9b83-c3e7f1185381]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.288 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[3800a27d-ab9e-4d34-97cd-7b723c422968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.291 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[e02a5d48-1036-4d86-9e60-0e1713c2a20f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.305 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa6558b-766f-4c00-a0e9-b20157d0f406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 17 17:33:51 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 12.608s CPU time.
Feb 17 17:33:51 compute-0 systemd-machined[155877]: Machine qemu-7-instance-00000007 terminated.
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.319 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[77fae0dc-a9af-4bd2-bd9d-ecd32e61aa20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9ae613-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:38:6f:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334273, 'reachable_time': 36640, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218094, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.330 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0d082ff1-8073-4822-94ef-393c3c044fac]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a9ae613-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334283, 'tstamp': 334283}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218095, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tap0a9ae613-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334286, 'tstamp': 334286}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218095, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.331 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9ae613-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.332 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.335 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.335 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a9ae613-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.336 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.336 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a9ae613-f0, col_values=(('external_ids', {'iface-id': 'c51c13dd-c5ca-4593-a4ef-aa6abe6be705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:51 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:51.336 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.435 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.440 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.455 186483 DEBUG nova.compute.manager [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-unplugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.455 186483 DEBUG oslo_concurrency.lockutils [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.456 186483 DEBUG oslo_concurrency.lockutils [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.456 186483 DEBUG oslo_concurrency.lockutils [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.456 186483 DEBUG nova.compute.manager [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] No waiting events found dispatching network-vif-unplugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.457 186483 DEBUG nova.compute.manager [req-9a9f3e40-da7a-4deb-90f9-35212f1abc13 req-80485694-4064-423f-b516-99e776784fbc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-unplugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.471 186483 INFO nova.virt.libvirt.driver [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Instance destroyed successfully.
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.472 186483 DEBUG nova.objects.instance [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 8b2c187c-d396-49a6-bc6b-a784ab66a468 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.483 186483 DEBUG nova.virt.libvirt.vif [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:33:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-332137290',display_name='tempest-TestNetworkBasicOps-server-332137290',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-332137290',id=7,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDqubP3j3ONueIiHFyzXtRM0URmPjGw9t6js/e76shcAk3x1UJBOXDWolRvWenawlzqoUbZHdPsKvxLR8GE4untTlcIllm0bgOiuTQPR4oxuyVDkaiZtq4Ltt50HEZzWQQ==',key_name='tempest-TestNetworkBasicOps-43454294',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:33:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-0x5t0m19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:33:26Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=8b2c187c-d396-49a6-bc6b-a784ab66a468,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.484 186483 DEBUG nova.network.os_vif_util [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "address": "fa:16:3e:32:20:0b", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a5c580c-8f", "ovs_interfaceid": "4a5c580c-8f81-428a-9cfb-05b0ce894f36", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.484 186483 DEBUG nova.network.os_vif_util [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.485 186483 DEBUG os_vif [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.486 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.486 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a5c580c-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.488 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.490 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.490 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.493 186483 INFO os_vif [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:20:0b,bridge_name='br-int',has_traffic_filtering=True,id=4a5c580c-8f81-428a-9cfb-05b0ce894f36,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a5c580c-8f')
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.493 186483 INFO nova.virt.libvirt.driver [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Deleting instance files /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468_del
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.494 186483 INFO nova.virt.libvirt.driver [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Deletion of /var/lib/nova/instances/8b2c187c-d396-49a6-bc6b-a784ab66a468_del complete
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.538 186483 INFO nova.compute.manager [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.540 186483 DEBUG oslo.service.loopingcall [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.540 186483 DEBUG nova.compute.manager [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:33:51 compute-0 nova_compute[186479]: 2026-02-17 17:33:51.540 186483 DEBUG nova.network.neutron [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.004 186483 DEBUG nova.network.neutron [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.020 186483 INFO nova.compute.manager [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Took 0.48 seconds to deallocate network for instance.
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.061 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.061 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.125 186483 DEBUG nova.compute.provider_tree [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.140 186483 DEBUG nova.scheduler.client.report [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.162 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.184 186483 INFO nova.scheduler.client.report [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 8b2c187c-d396-49a6-bc6b-a784ab66a468
Feb 17 17:33:52 compute-0 nova_compute[186479]: 2026-02-17 17:33:52.245 186483 DEBUG oslo_concurrency.lockutils [None req-b3e6342e-2f03-442d-950f-c036fd6d4f19 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.529 186483 DEBUG nova.compute.manager [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.530 186483 DEBUG oslo_concurrency.lockutils [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.530 186483 DEBUG oslo_concurrency.lockutils [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.530 186483 DEBUG oslo_concurrency.lockutils [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "8b2c187c-d396-49a6-bc6b-a784ab66a468-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.531 186483 DEBUG nova.compute.manager [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] No waiting events found dispatching network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.531 186483 WARNING nova.compute.manager [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received unexpected event network-vif-plugged-4a5c580c-8f81-428a-9cfb-05b0ce894f36 for instance with vm_state deleted and task_state None.
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.531 186483 DEBUG nova.compute.manager [req-c7f29ccc-34e3-4f03-9059-6821548c44fd req-b24fba46-1278-4523-a25a-c286e1ece78c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Received event network-vif-deleted-4a5c580c-8f81-428a-9cfb-05b0ce894f36 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.618 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.619 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.635 186483 DEBUG nova.objects.instance [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'flavor' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.655 186483 DEBUG nova.virt.libvirt.vif [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.656 186483 DEBUG nova.network.os_vif_util [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.657 186483 DEBUG nova.network.os_vif_util [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.661 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.664 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.668 186483 DEBUG nova.virt.libvirt.driver [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Attempting to detach device tap6440ba6f-03 from instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.668 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] detach device xml: <interface type="ethernet">
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:c1:86:a4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <target dev="tap6440ba6f-03"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </interface>
Feb 17 17:33:53 compute-0 nova_compute[186479]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.684 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.689 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <name>instance-00000006</name>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <uuid>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</uuid>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:07</nova:creationTime>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:port uuid="6440ba6f-03ca-4cca-9d42-4f3232f1cbaf">
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <system>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='serial'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='uuid'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </system>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <os>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </os>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <features>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </features>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk' index='2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config' index='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:71:0b:7d'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='tapd586c1db-aa'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:c1:86:a4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='tap6440ba6f-03'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='net1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       </target>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </console>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <video>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </video>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c258,c501</label>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c258,c501</imagelabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </domain>
Feb 17 17:33:53 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.689 186483 INFO nova.virt.libvirt.driver [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully detached device tap6440ba6f-03 from instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a from the persistent domain config.
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.690 186483 DEBUG nova.virt.libvirt.driver [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] (1/8): Attempting to detach device tap6440ba6f-03 with device alias net1 from instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.690 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] detach device xml: <interface type="ethernet">
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <mac address="fa:16:3e:c1:86:a4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <model type="virtio"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <mtu size="1442"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <target dev="tap6440ba6f-03"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </interface>
Feb 17 17:33:53 compute-0 nova_compute[186479]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.764 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 kernel: tap6440ba6f-03 (unregistering): left promiscuous mode
Feb 17 17:33:53 compute-0 NetworkManager[56323]: <info>  [1771349633.7864] device (tap6440ba6f-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:33:53 compute-0 ovn_controller[96568]: 2026-02-17T17:33:53Z|00110|binding|INFO|Releasing lport 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf from this chassis (sb_readonly=0)
Feb 17 17:33:53 compute-0 ovn_controller[96568]: 2026-02-17T17:33:53Z|00111|binding|INFO|Setting lport 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf down in Southbound
Feb 17 17:33:53 compute-0 ovn_controller[96568]: 2026-02-17T17:33:53Z|00112|binding|INFO|Removing iface tap6440ba6f-03 ovn-installed in OVS
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.788 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.794 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:53.796 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:86:a4 10.100.0.28', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.28/28', 'neutron:device_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0614ce48-1760-453d-9a6d-b76cf85eeab3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:53.797 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf in datapath 0a9ae613-f3e7-4402-af22-4077e6ce992d unbound from our chassis
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.798 186483 DEBUG nova.virt.libvirt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Received event <DeviceRemovedEvent: 1771349633.7979243, 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Feb 17 17:33:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:53.798 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a9ae613-f3e7-4402-af22-4077e6ce992d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:33:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:53.799 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[69ddef5f-5ecc-44cd-84e2-1247da31d01b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:53.799 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d namespace which is not needed anymore
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.800 186483 DEBUG nova.virt.libvirt.driver [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Start waiting for the detach event from libvirt for device tap6440ba6f-03 with device alias net1 for instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.801 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.804 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <name>instance-00000006</name>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <uuid>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</uuid>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:07</nova:creationTime>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:port uuid="6440ba6f-03ca-4cca-9d42-4f3232f1cbaf">
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.28" ipVersion="4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <system>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='serial'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='uuid'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </system>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <os>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </os>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <features>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </features>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk' index='2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config' index='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:71:0b:7d'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target dev='tapd586c1db-aa'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       </target>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </console>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <video>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </video>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c258,c501</label>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c258,c501</imagelabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </domain>
Feb 17 17:33:53 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.804 186483 INFO nova.virt.libvirt.driver [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully detached device tap6440ba6f-03 from instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a from the live domain config.
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.805 186483 DEBUG nova.virt.libvirt.vif [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.805 186483 DEBUG nova.network.os_vif_util [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.806 186483 DEBUG nova.network.os_vif_util [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.806 186483 DEBUG os_vif [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.807 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.808 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6440ba6f-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.809 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.811 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.814 186483 INFO os_vif [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03')
Feb 17 17:33:53 compute-0 nova_compute[186479]: 2026-02-17 17:33:53.815 186483 DEBUG nova.virt.libvirt.guest [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:53</nova:creationTime>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:53 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:53 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:53 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:53 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:53 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [NOTICE]   (217810) : haproxy version is 2.8.14-c23fe91
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [NOTICE]   (217810) : path to executable is /usr/sbin/haproxy
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [WARNING]  (217810) : Exiting Master process...
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [WARNING]  (217810) : Exiting Master process...
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [ALERT]    (217810) : Current worker (217815) exited with code 143 (Terminated)
Feb 17 17:33:53 compute-0 neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d[217787]: [WARNING]  (217810) : All workers exited. Exiting... (0)
Feb 17 17:33:53 compute-0 systemd[1]: libpod-703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933.scope: Deactivated successfully.
Feb 17 17:33:53 compute-0 podman[218135]: 2026-02-17 17:33:53.913190936 +0000 UTC m=+0.042863967 container died 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:33:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933-userdata-shm.mount: Deactivated successfully.
Feb 17 17:33:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-310c89dc0e85e6a75d21a9dbed2cc39edab91408fb2ac88656c51957472cd789-merged.mount: Deactivated successfully.
Feb 17 17:33:53 compute-0 podman[218135]: 2026-02-17 17:33:53.953611452 +0000 UTC m=+0.083284423 container cleanup 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:33:53 compute-0 systemd[1]: libpod-conmon-703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933.scope: Deactivated successfully.
Feb 17 17:33:54 compute-0 podman[218162]: 2026-02-17 17:33:54.000491127 +0000 UTC m=+0.034015449 container remove 703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.004 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[cc501bf2-fff6-468f-80ad-678dcdf9f411]: (4, ('Tue Feb 17 05:33:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d (703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933)\n703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933\nTue Feb 17 05:33:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d (703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933)\n703d72bf948ad650ff1719590c4a4aff8f1d05d6e4e19607b34b61e27173d933\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.006 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7740c2-f01f-4008-bc3e-457c72b8e409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.006 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9ae613-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.008 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:54 compute-0 kernel: tap0a9ae613-f0: left promiscuous mode
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.013 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.013 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.017 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[42cbaa3d-660f-4601-8f8e-1598f70657d3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.031 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e0bffd7d-3e7b-4cc5-ad4a-17c763a05d84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.033 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3e86e55d-651b-4664-92c7-29f5bac8cfbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.046 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0690d04b-2a1a-4585-acf3-682e3a0e580c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334268, 'reachable_time': 43842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218182, 'error': None, 'target': 'ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a9ae613\x2df3e7\x2d4402\x2daf22\x2d4077e6ce992d.mount: Deactivated successfully.
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.049 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a9ae613-f3e7-4402-af22-4077e6ce992d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:33:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:54.049 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[68bdec22-3b2f-4d54-8966-06bf2afbc4aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.518 186483 DEBUG nova.compute.manager [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-unplugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.519 186483 DEBUG oslo_concurrency.lockutils [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.519 186483 DEBUG oslo_concurrency.lockutils [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.519 186483 DEBUG oslo_concurrency.lockutils [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.519 186483 DEBUG nova.compute.manager [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-unplugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.520 186483 WARNING nova.compute.manager [req-6b853605-a2ce-4f13-8c51-73a1b20b74d9 req-10ffa4c6-1f31-42c3-b32c-e78e1800df84 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-unplugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf for instance with vm_state active and task_state None.
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.734 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.735 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:54 compute-0 nova_compute[186479]: 2026-02-17 17:33:54.736 186483 DEBUG nova.network.neutron [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.625 186483 DEBUG nova.compute.manager [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-deleted-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.626 186483 INFO nova.compute.manager [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Neutron deleted interface 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf; detaching it from the instance and deleting it from the info cache
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.626 186483 DEBUG nova.network.neutron [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.680 186483 DEBUG nova.objects.instance [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lazy-loading 'system_metadata' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.702 186483 DEBUG nova.objects.instance [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lazy-loading 'flavor' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:55 compute-0 podman[218183]: 2026-02-17 17:33:55.706601656 +0000 UTC m=+0.043511002 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.719 186483 DEBUG nova.virt.libvirt.vif [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.719 186483 DEBUG nova.network.os_vif_util [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.720 186483 DEBUG nova.network.os_vif_util [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.723 186483 DEBUG nova.virt.libvirt.guest [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.726 186483 DEBUG nova.virt.libvirt.guest [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <name>instance-00000006</name>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <uuid>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</uuid>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:53</nova:creationTime>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <system>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='serial'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='uuid'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </system>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <os>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </os>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <features>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </features>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk' index='2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config' index='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:71:0b:7d'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='tapd586c1db-aa'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       </target>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </console>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <video>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </video>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c258,c501</label>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c258,c501</imagelabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]: </domain>
Feb 17 17:33:55 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.726 186483 DEBUG nova.virt.libvirt.guest [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.728 186483 DEBUG nova.virt.libvirt.guest [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c1:86:a4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6440ba6f-03"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <name>instance-00000006</name>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <uuid>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</uuid>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:53</nova:creationTime>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <memory unit='KiB'>131072</memory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <vcpu placement='static'>1</vcpu>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <resource>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <partition>/machine</partition>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </resource>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <sysinfo type='smbios'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <system>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='manufacturer'>RDO</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='product'>OpenStack Compute</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='version'>27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='serial'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='uuid'>3f9e4572-dddc-48ee-8ecf-d52e23be5e3a</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <entry name='family'>Virtual Machine</entry>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </system>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <os>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <boot dev='hd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <smbios mode='sysinfo'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </os>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <features>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <vmcoreinfo state='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </features>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <cpu mode='custom' match='exact' check='full'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <vendor>AMD</vendor>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='x2apic'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc-deadline'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='hypervisor'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='tsc_adjust'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='spec-ctrl'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='stibp'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='cmp_legacy'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='overflow-recov'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='succor'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='ibrs'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='amd-ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='virt-ssbd'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='lbrv'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='tsc-scale'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='vmcb-clean'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='flushbyasid'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pause-filter'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='pfthreshold'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='xsaves'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='svm'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='require' name='topoext'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='npt'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <feature policy='disable' name='nrip-save'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <clock offset='utc'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='pit' tickpolicy='delay'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <timer name='hpet' present='no'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_poweroff>destroy</on_poweroff>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_reboot>restart</on_reboot>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <on_crash>destroy</on_crash>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <disk type='file' device='disk'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk' index='2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backingStore type='file' index='3'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <format type='raw'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <source file='/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <backingStore/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       </backingStore>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='vda' bus='virtio'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='virtio-disk0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <disk type='file' device='cdrom'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='qemu' type='raw' cache='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/disk.config' index='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backingStore/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='sda' bus='sata'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <readonly/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='sata0-0-0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='0' model='pcie-root'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pcie.0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='1' port='0x10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='2' port='0x11'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='3' port='0x12'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='4' port='0x13'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='5' port='0x14'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='6' port='0x15'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='7' port='0x16'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='8' port='0x17'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.8'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='9' port='0x18'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.9'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='10' port='0x19'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='11' port='0x1a'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.11'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='12' port='0x1b'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.12'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='13' port='0x1c'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.13'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='14' port='0x1d'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.14'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='15' port='0x1e'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.15'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='16' port='0x1f'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.16'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='17' port='0x20'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.17'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='18' port='0x21'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.18'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='19' port='0x22'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.19'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='20' port='0x23'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.20'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='21' port='0x24'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.21'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='22' port='0x25'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.22'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='23' port='0x26'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.23'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='24' port='0x27'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.24'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-root-port'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target chassis='25' port='0x28'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.25'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model name='pcie-pci-bridge'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='pci.26'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='usb'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <controller type='sata' index='0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='ide'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </controller>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <interface type='ethernet'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <mac address='fa:16:3e:71:0b:7d'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target dev='tapd586c1db-aa'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model type='virtio'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <driver name='vhost' rx_queue_size='512'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <mtu size='1442'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='net0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <serial type='pty'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target type='isa-serial' port='0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:         <model name='isa-serial'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       </target>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <console type='pty' tty='/dev/pts/0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <source path='/dev/pts/0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <log file='/var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a/console.log' append='off'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <target type='serial' port='0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='serial0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </console>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='tablet' bus='usb'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='usb' bus='0' port='1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='mouse' bus='ps2'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input1'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <input type='keyboard' bus='ps2'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='input2'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </input>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <listen type='address' address='::0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </graphics>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <audio id='1' type='none'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <video>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <model type='virtio' heads='1' primary='yes'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='video0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </video>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <watchdog model='itco' action='reset'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='watchdog0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </watchdog>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <memballoon model='virtio'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <stats period='10'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='balloon0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <rng model='virtio'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <backend model='random'>/dev/urandom</backend>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <alias name='rng0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <label>system_u:system_r:svirt_t:s0:c258,c501</label>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c258,c501</imagelabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <label>+107:+107</label>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <imagelabel>+107:+107</imagelabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </seclabel>
Feb 17 17:33:55 compute-0 nova_compute[186479]: </domain>
Feb 17 17:33:55 compute-0 nova_compute[186479]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.728 186483 WARNING nova.virt.libvirt.driver [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Detaching interface fa:16:3e:c1:86:a4 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap6440ba6f-03' not found.
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.729 186483 DEBUG nova.virt.libvirt.vif [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.729 186483 DEBUG nova.network.os_vif_util [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converting VIF {"id": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "address": "fa:16:3e:c1:86:a4", "network": {"id": "0a9ae613-f3e7-4402-af22-4077e6ce992d", "bridge": "br-int", "label": "tempest-network-smoke--1552166111", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.28", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6440ba6f-03", "ovs_interfaceid": "6440ba6f-03ca-4cca-9d42-4f3232f1cbaf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.730 186483 DEBUG nova.network.os_vif_util [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.730 186483 DEBUG os_vif [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.731 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.732 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6440ba6f-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.732 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.733 186483 INFO os_vif [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:86:a4,bridge_name='br-int',has_traffic_filtering=True,id=6440ba6f-03ca-4cca-9d42-4f3232f1cbaf,network=Network(0a9ae613-f3e7-4402-af22-4077e6ce992d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6440ba6f-03')
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.734 186483 DEBUG nova.virt.libvirt.guest [req-d38a9764-4384-4a06-ad08-2ab78259ab4d req-7cca1091-290e-4b19-bb06-b686dd1f86a4 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:name>tempest-TestNetworkBasicOps-server-2104168838</nova:name>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:creationTime>2026-02-17 17:33:55</nova:creationTime>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:flavor name="m1.nano">
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:memory>128</nova:memory>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:disk>1</nova:disk>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:swap>0</nova:swap>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:vcpus>1</nova:vcpus>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:flavor>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:owner>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   <nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     <nova:port uuid="d586c1db-aaaa-45aa-8c55-c3a66e387a6e">
Feb 17 17:33:55 compute-0 nova_compute[186479]:       <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:33:55 compute-0 nova_compute[186479]:     </nova:port>
Feb 17 17:33:55 compute-0 nova_compute[186479]:   </nova:ports>
Feb 17 17:33:55 compute-0 nova_compute[186479]: </nova:instance>
Feb 17 17:33:55 compute-0 nova_compute[186479]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 17 17:33:55 compute-0 podman[218184]: 2026-02-17 17:33:55.739752433 +0000 UTC m=+0.068127369 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 17 17:33:55 compute-0 ovn_controller[96568]: 2026-02-17T17:33:55Z|00113|binding|INFO|Releasing lport 4efe1491-cdee-41bd-a179-ae1b240d0928 from this chassis (sb_readonly=0)
Feb 17 17:33:55 compute-0 nova_compute[186479]: 2026-02-17 17:33:55.941 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.201 186483 INFO nova.network.neutron [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Port 6440ba6f-03ca-4cca-9d42-4f3232f1cbaf from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.202 186483 DEBUG nova.network.neutron [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.224 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.252 186483 DEBUG oslo_concurrency.lockutils [None req-998294e9-cae0-48b7-8f7f-e8d74ba05ed6 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "interface-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.319 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.320 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.336 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.454 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.455 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.455 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.456 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.456 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.458 186483 INFO nova.compute.manager [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Terminating instance
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.459 186483 DEBUG nova.compute.manager [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:33:56 compute-0 kernel: tapd586c1db-aa (unregistering): left promiscuous mode
Feb 17 17:33:56 compute-0 NetworkManager[56323]: <info>  [1771349636.4838] device (tapd586c1db-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.483 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.492 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 ovn_controller[96568]: 2026-02-17T17:33:56Z|00114|binding|INFO|Releasing lport d586c1db-aaaa-45aa-8c55-c3a66e387a6e from this chassis (sb_readonly=0)
Feb 17 17:33:56 compute-0 ovn_controller[96568]: 2026-02-17T17:33:56Z|00115|binding|INFO|Setting lport d586c1db-aaaa-45aa-8c55-c3a66e387a6e down in Southbound
Feb 17 17:33:56 compute-0 ovn_controller[96568]: 2026-02-17T17:33:56Z|00116|binding|INFO|Removing iface tapd586c1db-aa ovn-installed in OVS
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.496 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.500 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:0b:7d 10.100.0.10'], port_security=['fa:16:3e:71:0b:7d 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '3f9e4572-dddc-48ee-8ecf-d52e23be5e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c2aaec3-a759-4c1d-aa84-8bc6ccd10feb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1d0dffc-0ac5-44da-8066-c0d43978c220, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=d586c1db-aaaa-45aa-8c55-c3a66e387a6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.502 105898 INFO neutron.agent.ovn.metadata.agent [-] Port d586c1db-aaaa-45aa-8c55-c3a66e387a6e in datapath 4bf50377-716d-42af-ab2c-e962c79e0a2f unbound from our chassis
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.503 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4bf50377-716d-42af-ab2c-e962c79e0a2f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.504 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.504 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[523058dc-84cc-4e85-b3d6-2b5cda638078]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.506 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f namespace which is not needed anymore
Feb 17 17:33:56 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 17 17:33:56 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 15.107s CPU time.
Feb 17 17:33:56 compute-0 systemd-machined[155877]: Machine qemu-6-instance-00000006 terminated.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.602 186483 DEBUG nova.compute.manager [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.602 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.602 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.603 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.603 186483 DEBUG nova.compute.manager [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.603 186483 WARNING nova.compute.manager [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-plugged-6440ba6f-03ca-4cca-9d42-4f3232f1cbaf for instance with vm_state active and task_state deleting.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.603 186483 DEBUG nova.compute.manager [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.603 186483 DEBUG nova.compute.manager [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing instance network info cache due to event network-changed-d586c1db-aaaa-45aa-8c55-c3a66e387a6e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.604 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.604 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.604 186483 DEBUG nova.network.neutron [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Refreshing network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [NOTICE]   (217577) : haproxy version is 2.8.14-c23fe91
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [NOTICE]   (217577) : path to executable is /usr/sbin/haproxy
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [WARNING]  (217577) : Exiting Master process...
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [WARNING]  (217577) : Exiting Master process...
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [ALERT]    (217577) : Current worker (217579) exited with code 143 (Terminated)
Feb 17 17:33:56 compute-0 neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f[217573]: [WARNING]  (217577) : All workers exited. Exiting... (0)
Feb 17 17:33:56 compute-0 systemd[1]: libpod-d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055.scope: Deactivated successfully.
Feb 17 17:33:56 compute-0 podman[218249]: 2026-02-17 17:33:56.658966817 +0000 UTC m=+0.048135236 container died d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.689 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055-userdata-shm.mount: Deactivated successfully.
Feb 17 17:33:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-79767c4502347ad9025f7b377435d40f49f2296dd68c485cc8368fd78cb9d354-merged.mount: Deactivated successfully.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.695 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 podman[218249]: 2026-02-17 17:33:56.70169506 +0000 UTC m=+0.090863469 container cleanup d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:33:56 compute-0 systemd[1]: libpod-conmon-d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055.scope: Deactivated successfully.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.720 186483 INFO nova.virt.libvirt.driver [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Instance destroyed successfully.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.720 186483 DEBUG nova.objects.instance [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.733 186483 DEBUG nova.virt.libvirt.vif [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104168838',display_name='tempest-TestNetworkBasicOps-server-2104168838',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104168838',id=6,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGSEk4tvtHx+1wxTUolBp5FEKwhcwrFkbyGug9B2dJQk+Gxis2F5tzoHoKU+6EzBSJMG7knNQ1i0UckkdjQ1ANkfH3T0OzsYfzd7j6wv5dyVb9QJBpDJuL+Kt9oifXUkw==',key_name='tempest-TestNetworkBasicOps-347025944',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:32:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-vfuyzqzc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:32:42Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=3f9e4572-dddc-48ee-8ecf-d52e23be5e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.734 186483 DEBUG nova.network.os_vif_util [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.734 186483 DEBUG nova.network.os_vif_util [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.735 186483 DEBUG os_vif [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.736 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.736 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd586c1db-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.737 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.739 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.741 186483 INFO os_vif [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:0b:7d,bridge_name='br-int',has_traffic_filtering=True,id=d586c1db-aaaa-45aa-8c55-c3a66e387a6e,network=Network(4bf50377-716d-42af-ab2c-e962c79e0a2f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd586c1db-aa')
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.741 186483 INFO nova.virt.libvirt.driver [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Deleting instance files /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a_del
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.741 186483 INFO nova.virt.libvirt.driver [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Deletion of /var/lib/nova/instances/3f9e4572-dddc-48ee-8ecf-d52e23be5e3a_del complete
Feb 17 17:33:56 compute-0 podman[218292]: 2026-02-17 17:33:56.75692725 +0000 UTC m=+0.034594693 container remove d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.759 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[14cb4696-33ba-4f1e-ae3b-bd5fc034aca2]: (4, ('Tue Feb 17 05:33:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f (d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055)\nd0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055\nTue Feb 17 05:33:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f (d0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055)\nd0133fbf82b0a247d1145430d220a30e8183cdbd971fff86c74b951abe83c055\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.761 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[9c420140-9fc5-4742-b616-c954a0066f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.762 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bf50377-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.763 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 kernel: tap4bf50377-70: left promiscuous mode
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.768 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.770 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ac42e64b-1070-4d00-b438-b23c8cb0e550]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.785 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e425b33b-2282-4b77-a446-be853e442c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.786 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[42ab9e5f-b816-4161-a96c-b58878e471c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.786 186483 INFO nova.compute.manager [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.787 186483 DEBUG oslo.service.loopingcall [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.788 186483 DEBUG nova.compute.manager [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:33:56 compute-0 nova_compute[186479]: 2026-02-17 17:33:56.788 186483 DEBUG nova.network.neutron [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.799 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[62f5a158-d531-4a14-bf9a-28af49cf5091]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331704, 'reachable_time': 19876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218308, 'error': None, 'target': 'ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.801 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4bf50377-716d-42af-ab2c-e962c79e0a2f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:33:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:33:56.801 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6743c5-1745-4775-b617-303ade04f250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:33:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d4bf50377\x2d716d\x2d42af\x2dab2c\x2de962c79e0a2f.mount: Deactivated successfully.
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.784 186483 DEBUG nova.network.neutron [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.800 186483 INFO nova.compute.manager [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Took 1.01 seconds to deallocate network for instance.
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.845 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.846 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.898 186483 DEBUG nova.compute.provider_tree [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.920 186483 DEBUG nova.scheduler.client.report [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.939 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:57 compute-0 nova_compute[186479]: 2026-02-17 17:33:57.960 186483 INFO nova.scheduler.client.report [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.019 186483 DEBUG oslo_concurrency.lockutils [None req-c7238cc8-f62c-4ac6-aa89-a2ee70d133a4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.216 186483 DEBUG nova.network.neutron [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updated VIF entry in instance network info cache for port d586c1db-aaaa-45aa-8c55-c3a66e387a6e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.217 186483 DEBUG nova.network.neutron [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Updating instance_info_cache with network_info: [{"id": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "address": "fa:16:3e:71:0b:7d", "network": {"id": "4bf50377-716d-42af-ab2c-e962c79e0a2f", "bridge": "br-int", "label": "tempest-network-smoke--1389548266", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd586c1db-aa", "ovs_interfaceid": "d586c1db-aaaa-45aa-8c55-c3a66e387a6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.237 186483 DEBUG oslo_concurrency.lockutils [req-753bcc75-d5f1-4783-a452-bc3a2a02cecb req-2aa78cde-7963-452d-83f1-993924b55516 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-3f9e4572-dddc-48ee-8ecf-d52e23be5e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.320 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.677 186483 DEBUG nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-unplugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.677 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.678 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.679 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.680 186483 DEBUG nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-unplugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.680 186483 WARNING nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-unplugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e for instance with vm_state deleted and task_state None.
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.680 186483 DEBUG nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.681 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.682 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.682 186483 DEBUG oslo_concurrency.lockutils [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "3f9e4572-dddc-48ee-8ecf-d52e23be5e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.683 186483 DEBUG nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] No waiting events found dispatching network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.683 186483 WARNING nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received unexpected event network-vif-plugged-d586c1db-aaaa-45aa-8c55-c3a66e387a6e for instance with vm_state deleted and task_state None.
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.684 186483 DEBUG nova.compute.manager [req-e4b3a3c1-47bc-4e04-a783-94646bab80f9 req-1719ffc5-b141-4a88-bacb-aa51e054cf2a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Received event network-vif-deleted-d586c1db-aaaa-45aa-8c55-c3a66e387a6e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:33:58 compute-0 nova_compute[186479]: 2026-02-17 17:33:58.767 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:00 compute-0 nova_compute[186479]: 2026-02-17 17:34:00.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:01 compute-0 nova_compute[186479]: 2026-02-17 17:34:01.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:01 compute-0 nova_compute[186479]: 2026-02-17 17:34:01.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:01 compute-0 nova_compute[186479]: 2026-02-17 17:34:01.547 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:01 compute-0 podman[218310]: 2026-02-17 17:34:01.706866111 +0000 UTC m=+0.044785354 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:34:01 compute-0 nova_compute[186479]: 2026-02-17 17:34:01.738 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.297 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.537 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.537 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.538 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.538 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.668 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.669 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5766MB free_disk=73.20706558227539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.669 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.669 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.799 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.799 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.872 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.887 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.915 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.915 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:02 compute-0 nova_compute[186479]: 2026-02-17 17:34:02.916 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:03 compute-0 nova_compute[186479]: 2026-02-17 17:34:03.768 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:03 compute-0 nova_compute[186479]: 2026-02-17 17:34:03.928 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:06 compute-0 nova_compute[186479]: 2026-02-17 17:34:06.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:06 compute-0 nova_compute[186479]: 2026-02-17 17:34:06.470 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349631.4688683, 8b2c187c-d396-49a6-bc6b-a784ab66a468 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:06 compute-0 nova_compute[186479]: 2026-02-17 17:34:06.470 186483 INFO nova.compute.manager [-] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] VM Stopped (Lifecycle Event)
Feb 17 17:34:06 compute-0 nova_compute[186479]: 2026-02-17 17:34:06.494 186483 DEBUG nova.compute.manager [None req-ba1d9566-fa40-4611-9e2c-e971d590c974 - - - - - -] [instance: 8b2c187c-d396-49a6-bc6b-a784ab66a468] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:06 compute-0 nova_compute[186479]: 2026-02-17 17:34:06.743 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:07 compute-0 nova_compute[186479]: 2026-02-17 17:34:07.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:07 compute-0 nova_compute[186479]: 2026-02-17 17:34:07.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:34:07 compute-0 nova_compute[186479]: 2026-02-17 17:34:07.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:34:07 compute-0 nova_compute[186479]: 2026-02-17 17:34:07.322 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:34:08 compute-0 nova_compute[186479]: 2026-02-17 17:34:08.769 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:09 compute-0 podman[218336]: 2026-02-17 17:34:09.727797978 +0000 UTC m=+0.073336528 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true)
Feb 17 17:34:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:11 compute-0 nova_compute[186479]: 2026-02-17 17:34:11.718 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349636.7180126, 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:11 compute-0 nova_compute[186479]: 2026-02-17 17:34:11.719 186483 INFO nova.compute.manager [-] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] VM Stopped (Lifecycle Event)
Feb 17 17:34:11 compute-0 nova_compute[186479]: 2026-02-17 17:34:11.741 186483 DEBUG nova.compute.manager [None req-33de3951-81a4-4185-9e65-ba4b2d233610 - - - - - -] [instance: 3f9e4572-dddc-48ee-8ecf-d52e23be5e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:11 compute-0 nova_compute[186479]: 2026-02-17 17:34:11.747 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:14 compute-0 nova_compute[186479]: 2026-02-17 17:34:14.883 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:14 compute-0 podman[218363]: 2026-02-17 17:34:14.900392425 +0000 UTC m=+1.247690957 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:34:16 compute-0 nova_compute[186479]: 2026-02-17 17:34:16.750 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:19 compute-0 podman[218387]: 2026-02-17 17:34:19.726212698 +0000 UTC m=+0.057445216 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public)
Feb 17 17:34:19 compute-0 nova_compute[186479]: 2026-02-17 17:34:19.922 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:21 compute-0 nova_compute[186479]: 2026-02-17 17:34:21.753 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.110 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.111 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.129 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.256 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.257 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.265 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.265 186483 INFO nova.compute.claims [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.369 186483 DEBUG nova.scheduler.client.report [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Refreshing inventories for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.392 186483 DEBUG nova.scheduler.client.report [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Updating ProviderTree inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.393 186483 DEBUG nova.compute.provider_tree [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.409 186483 DEBUG nova.scheduler.client.report [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Refreshing aggregate associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.431 186483 DEBUG nova.scheduler.client.report [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Refreshing trait associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SHA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_BMI2,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_AVX,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.468 186483 DEBUG nova.compute.provider_tree [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.489 186483 DEBUG nova.scheduler.client.report [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.514 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.515 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.568 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.569 186483 DEBUG nova.network.neutron [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.592 186483 INFO nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.609 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.696 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.697 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.698 186483 INFO nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Creating image(s)
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.698 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.699 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.699 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.713 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.795 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.796 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.797 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.809 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.823 186483 DEBUG nova.policy [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.866 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.867 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.914 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.915 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.915 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.963 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.964 186483 DEBUG nova.virt.disk.api [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:34:23 compute-0 nova_compute[186479]: 2026-02-17 17:34:23.964 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.006 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.007 186483 DEBUG nova.virt.disk.api [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.007 186483 DEBUG nova.objects.instance [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 5b203313-4460-4f2b-b0d0-29e1846079ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.024 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.025 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Ensure instance console log exists: /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.025 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.026 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.026 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:24 compute-0 nova_compute[186479]: 2026-02-17 17:34:24.923 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.401 186483 DEBUG nova.network.neutron [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Successfully updated port: 278d4f1d-34a9-436c-a8c6-104778e90a0f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.419 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.420 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.420 186483 DEBUG nova.network.neutron [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.581 186483 DEBUG nova.compute.manager [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.582 186483 DEBUG nova.compute.manager [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Refreshing instance network info cache due to event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:34:25 compute-0 nova_compute[186479]: 2026-02-17 17:34:25.582 186483 DEBUG oslo_concurrency.lockutils [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:34:26 compute-0 nova_compute[186479]: 2026-02-17 17:34:26.357 186483 DEBUG nova.network.neutron [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:34:26 compute-0 podman[218423]: 2026-02-17 17:34:26.701390751 +0000 UTC m=+0.048241310 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Feb 17 17:34:26 compute-0 podman[218424]: 2026-02-17 17:34:26.73787587 +0000 UTC m=+0.076649400 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 17 17:34:26 compute-0 nova_compute[186479]: 2026-02-17 17:34:26.757 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.706 186483 DEBUG nova.network.neutron [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updating instance_info_cache with network_info: [{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.726 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.726 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Instance network_info: |[{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.727 186483 DEBUG oslo_concurrency.lockutils [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.727 186483 DEBUG nova.network.neutron [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Refreshing network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.732 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Start _get_guest_xml network_info=[{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.740 186483 WARNING nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.745 186483 DEBUG nova.virt.libvirt.host [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.746 186483 DEBUG nova.virt.libvirt.host [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.756 186483 DEBUG nova.virt.libvirt.host [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.757 186483 DEBUG nova.virt.libvirt.host [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.758 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.758 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.759 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.759 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.760 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.760 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.761 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.761 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.762 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.762 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.763 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.763 186483 DEBUG nova.virt.hardware [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.769 186483 DEBUG nova.virt.libvirt.vif [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:34:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-910863108',display_name='tempest-TestNetworkBasicOps-server-910863108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-910863108',id=8,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOdUD/Bw66VXONgSaCPfh7FaRXz8ERKM2oVwL5Yz4c7ndcm8vXRzyK/LRmZ1Zg9IfI0uWKxlWNWH+xbU+ASdmRuGc9K5SqYIlXjvmJ306dy4DOUzL9AVpBmhCgZzAIMG6Q==',key_name='tempest-TestNetworkBasicOps-1601711742',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-veo80t4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:34:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=5b203313-4460-4f2b-b0d0-29e1846079ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.769 186483 DEBUG nova.network.os_vif_util [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.771 186483 DEBUG nova.network.os_vif_util [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.772 186483 DEBUG nova.objects.instance [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b203313-4460-4f2b-b0d0-29e1846079ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.786 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <uuid>5b203313-4460-4f2b-b0d0-29e1846079ed</uuid>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <name>instance-00000008</name>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-910863108</nova:name>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:34:27</nova:creationTime>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         <nova:port uuid="278d4f1d-34a9-436c-a8c6-104778e90a0f">
Feb 17 17:34:27 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <system>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="serial">5b203313-4460-4f2b-b0d0-29e1846079ed</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="uuid">5b203313-4460-4f2b-b0d0-29e1846079ed</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </system>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <os>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </os>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <features>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </features>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.config"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:c9:d0:d4"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <target dev="tap278d4f1d-34"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/console.log" append="off"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <video>
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </video>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:34:27 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:34:27 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:34:27 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:34:27 compute-0 nova_compute[186479]: </domain>
Feb 17 17:34:27 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.788 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Preparing to wait for external event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.788 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.788 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.789 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.789 186483 DEBUG nova.virt.libvirt.vif [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:34:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-910863108',display_name='tempest-TestNetworkBasicOps-server-910863108',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-910863108',id=8,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOdUD/Bw66VXONgSaCPfh7FaRXz8ERKM2oVwL5Yz4c7ndcm8vXRzyK/LRmZ1Zg9IfI0uWKxlWNWH+xbU+ASdmRuGc9K5SqYIlXjvmJ306dy4DOUzL9AVpBmhCgZzAIMG6Q==',key_name='tempest-TestNetworkBasicOps-1601711742',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-veo80t4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:34:23Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=5b203313-4460-4f2b-b0d0-29e1846079ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.790 186483 DEBUG nova.network.os_vif_util [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.790 186483 DEBUG nova.network.os_vif_util [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.791 186483 DEBUG os_vif [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.791 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.791 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.792 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.795 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.795 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap278d4f1d-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.796 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap278d4f1d-34, col_values=(('external_ids', {'iface-id': '278d4f1d-34a9-436c-a8c6-104778e90a0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:d0:d4', 'vm-uuid': '5b203313-4460-4f2b-b0d0-29e1846079ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.820 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:27 compute-0 NetworkManager[56323]: <info>  [1771349667.8219] manager: (tap278d4f1d-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.823 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.829 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.830 186483 INFO os_vif [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34')
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.871 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.872 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.872 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:c9:d0:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:34:27 compute-0 nova_compute[186479]: 2026-02-17 17:34:27.872 186483 INFO nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Using config drive
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.510 186483 INFO nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Creating config drive at /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.config
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.518 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzhjijin_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.645 186483 DEBUG oslo_concurrency.processutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzhjijin_" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:28 compute-0 kernel: tap278d4f1d-34: entered promiscuous mode
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.6917] manager: (tap278d4f1d-34): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Feb 17 17:34:28 compute-0 ovn_controller[96568]: 2026-02-17T17:34:28Z|00117|binding|INFO|Claiming lport 278d4f1d-34a9-436c-a8c6-104778e90a0f for this chassis.
Feb 17 17:34:28 compute-0 ovn_controller[96568]: 2026-02-17T17:34:28Z|00118|binding|INFO|278d4f1d-34a9-436c-a8c6-104778e90a0f: Claiming fa:16:3e:c9:d0:d4 10.100.0.13
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.692 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.694 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.709 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d0:d4 10.100.0.13'], port_security=['fa:16:3e:c9:d0:d4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5b203313-4460-4f2b-b0d0-29e1846079ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80559a54-f2f5-4114-9204-16477dc05b22, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=278d4f1d-34a9-436c-a8c6-104778e90a0f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.711 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 278d4f1d-34a9-436c-a8c6-104778e90a0f in datapath 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 bound to our chassis
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.712 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:28 compute-0 systemd-udevd[218481]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.721 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.722 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b5fc34b3-529d-4c43-bf59-4df0080be8fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.723 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1551a1a1-41 in ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.725 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1551a1a1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.725 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1357c2c3-263a-40de-9179-7d2cd99ad0b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_controller[96568]: 2026-02-17T17:34:28Z|00119|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f ovn-installed in OVS
Feb 17 17:34:28 compute-0 ovn_controller[96568]: 2026-02-17T17:34:28Z|00120|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f up in Southbound
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.726 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a921ffa4-83a6-40d6-a744-3a623666bbe8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.729 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 systemd-machined[155877]: New machine qemu-8-instance-00000008.
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.7374] device (tap278d4f1d-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.7382] device (tap278d4f1d-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.737 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[0a32d2a5-4f65-4fae-8242-5f50cb179c55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.751 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1f7846-bd98-4636-8d54-8a39e1c41244]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.778 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[2d081c2a-902a-404e-9cd2-c0f49e4f7e8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.7863] manager: (tap1551a1a1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.786 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa1c6ad-8882-4900-974a-c8cbe4c31c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.807 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[acdb5f0e-a6ad-4405-84f5-ceaab0144329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.811 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[d74265c2-2a6a-4983-9aae-618744e48ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.8312] device (tap1551a1a1-40): carrier: link connected
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.835 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[f76cf14c-a3fe-4b2b-9e6f-d5236f7420c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.851 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ce0ddee7-9dfc-4437-84d5-d0dff26cd657]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1551a1a1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a0:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342414, 'reachable_time': 18647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218514, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.864 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[252cb9d1-9633-45c3-ac30-aa48190662c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:a0f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 342414, 'tstamp': 342414}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218515, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.879 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[fa33b1bb-2fec-4b84-b099-e218a10725f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1551a1a1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a0:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342414, 'reachable_time': 18647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218516, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.904 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c93b74cc-cbcf-4b6e-a899-f6b4697515db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.946 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3be3cf55-3ba3-4084-87f3-39d60071d7ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.947 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1551a1a1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.947 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.948 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1551a1a1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:28 compute-0 kernel: tap1551a1a1-40: entered promiscuous mode
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.949 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 NetworkManager[56323]: <info>  [1771349668.9499] manager: (tap1551a1a1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.953 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1551a1a1-40, col_values=(('external_ids', {'iface-id': '694e808e-a1c9-4900-a806-bd631fbb0446'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.954 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.955 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:28 compute-0 ovn_controller[96568]: 2026-02-17T17:34:28Z|00121|binding|INFO|Releasing lport 694e808e-a1c9-4900-a806-bd631fbb0446 from this chassis (sb_readonly=0)
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.955 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.956 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3d08520a-f999-43ac-bcfb-26f09d094b0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.957 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:34:28 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:28.957 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'env', 'PROCESS_TAG=haproxy-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:34:28 compute-0 nova_compute[186479]: 2026-02-17 17:34:28.960 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.163 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349669.162588, 5b203313-4460-4f2b-b0d0-29e1846079ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.163 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] VM Started (Lifecycle Event)
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.188 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.192 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349669.1631694, 5b203313-4460-4f2b-b0d0-29e1846079ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.192 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] VM Paused (Lifecycle Event)
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.212 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.215 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.236 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:34:29 compute-0 podman[218555]: 2026-02-17 17:34:29.277506223 +0000 UTC m=+0.049311435 container create e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:29 compute-0 systemd[1]: Started libpod-conmon-e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4.scope.
Feb 17 17:34:29 compute-0 podman[218555]: 2026-02-17 17:34:29.250777755 +0000 UTC m=+0.022582997 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:34:29 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:34:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89517c1589e7f0f721b34a2831bf4e4a7e1a05196f3b1ef611f729dc6237b8e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:34:29 compute-0 podman[218555]: 2026-02-17 17:34:29.372276787 +0000 UTC m=+0.144082069 container init e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:29 compute-0 podman[218555]: 2026-02-17 17:34:29.380722465 +0000 UTC m=+0.152527717 container start e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 17 17:34:29 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [NOTICE]   (218574) : New worker (218576) forked
Feb 17 17:34:29 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [NOTICE]   (218574) : Loading success.
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.451 186483 DEBUG nova.compute.manager [req-59c244ec-60a9-4eae-a702-157a8b8aac15 req-ee7aa3a3-bb3b-4172-bb42-3407712cb3fb 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.452 186483 DEBUG oslo_concurrency.lockutils [req-59c244ec-60a9-4eae-a702-157a8b8aac15 req-ee7aa3a3-bb3b-4172-bb42-3407712cb3fb 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.452 186483 DEBUG oslo_concurrency.lockutils [req-59c244ec-60a9-4eae-a702-157a8b8aac15 req-ee7aa3a3-bb3b-4172-bb42-3407712cb3fb 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.453 186483 DEBUG oslo_concurrency.lockutils [req-59c244ec-60a9-4eae-a702-157a8b8aac15 req-ee7aa3a3-bb3b-4172-bb42-3407712cb3fb 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.453 186483 DEBUG nova.compute.manager [req-59c244ec-60a9-4eae-a702-157a8b8aac15 req-ee7aa3a3-bb3b-4172-bb42-3407712cb3fb 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Processing event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.454 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.459 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349669.4594564, 5b203313-4460-4f2b-b0d0-29e1846079ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.459 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] VM Resumed (Lifecycle Event)
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.462 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.465 186483 INFO nova.virt.libvirt.driver [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Instance spawned successfully.
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.465 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.502 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.509 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.510 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.510 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.510 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.511 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.511 186483 DEBUG nova.virt.libvirt.driver [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.518 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.566 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.590 186483 INFO nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Took 5.89 seconds to spawn the instance on the hypervisor.
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.590 186483 DEBUG nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.615 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:29.616 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:34:29 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:29.617 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.654 186483 INFO nova.compute.manager [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Took 6.48 seconds to build instance.
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.668 186483 DEBUG oslo_concurrency.lockutils [None req-8b352cd2-ce68-4eb5-be35-bb3e91c7b265 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.720 186483 DEBUG nova.network.neutron [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updated VIF entry in instance network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.722 186483 DEBUG nova.network.neutron [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updating instance_info_cache with network_info: [{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.740 186483 DEBUG oslo_concurrency.lockutils [req-13b89fdd-b79a-4f85-bcbb-cb38577250eb req-13832795-9d22-4e5b-99c9-0c205b85e044 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:34:29 compute-0 nova_compute[186479]: 2026-02-17 17:34:29.925 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.548 186483 DEBUG nova.compute.manager [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.549 186483 DEBUG oslo_concurrency.lockutils [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.549 186483 DEBUG oslo_concurrency.lockutils [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.550 186483 DEBUG oslo_concurrency.lockutils [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.550 186483 DEBUG nova.compute.manager [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] No waiting events found dispatching network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:31 compute-0 nova_compute[186479]: 2026-02-17 17:34:31.551 186483 WARNING nova.compute.manager [req-59a9c623-a3a1-4860-87b9-8fd8056c0a09 req-400f48c5-45dc-46af-8b8c-d9fcb73c9906 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received unexpected event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with vm_state active and task_state None.
Feb 17 17:34:32 compute-0 podman[218585]: 2026-02-17 17:34:32.732583907 +0000 UTC m=+0.072450735 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:34:32 compute-0 nova_compute[186479]: 2026-02-17 17:34:32.820 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:33 compute-0 nova_compute[186479]: 2026-02-17 17:34:33.374 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:33 compute-0 NetworkManager[56323]: <info>  [1771349673.3777] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Feb 17 17:34:33 compute-0 NetworkManager[56323]: <info>  [1771349673.3785] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Feb 17 17:34:33 compute-0 ovn_controller[96568]: 2026-02-17T17:34:33Z|00122|binding|INFO|Releasing lport 694e808e-a1c9-4900-a806-bd631fbb0446 from this chassis (sb_readonly=0)
Feb 17 17:34:33 compute-0 nova_compute[186479]: 2026-02-17 17:34:33.383 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.093 186483 DEBUG nova.compute.manager [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.093 186483 DEBUG nova.compute.manager [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Refreshing instance network info cache due to event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.093 186483 DEBUG oslo_concurrency.lockutils [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.094 186483 DEBUG oslo_concurrency.lockutils [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.094 186483 DEBUG nova.network.neutron [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Refreshing network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.250 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.251 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.251 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.251 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.251 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.252 186483 INFO nova.compute.manager [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Terminating instance
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.253 186483 DEBUG nova.compute.manager [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:34:34 compute-0 kernel: tap278d4f1d-34 (unregistering): left promiscuous mode
Feb 17 17:34:34 compute-0 NetworkManager[56323]: <info>  [1771349674.2699] device (tap278d4f1d-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.273 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 ovn_controller[96568]: 2026-02-17T17:34:34Z|00123|binding|INFO|Releasing lport 278d4f1d-34a9-436c-a8c6-104778e90a0f from this chassis (sb_readonly=0)
Feb 17 17:34:34 compute-0 ovn_controller[96568]: 2026-02-17T17:34:34Z|00124|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f down in Southbound
Feb 17 17:34:34 compute-0 ovn_controller[96568]: 2026-02-17T17:34:34Z|00125|binding|INFO|Removing iface tap278d4f1d-34 ovn-installed in OVS
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.275 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.281 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d0:d4 10.100.0.13'], port_security=['fa:16:3e:c9:d0:d4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5b203313-4460-4f2b-b0d0-29e1846079ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80559a54-f2f5-4114-9204-16477dc05b22, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=278d4f1d-34a9-436c-a8c6-104778e90a0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.283 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 278d4f1d-34a9-436c-a8c6-104778e90a0f in datapath 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 unbound from our chassis
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.283 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.284 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.285 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b678691f-e371-4fcd-a6f6-6fa14dddfb75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.286 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 namespace which is not needed anymore
Feb 17 17:34:34 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 17 17:34:34 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 5.304s CPU time.
Feb 17 17:34:34 compute-0 systemd-machined[155877]: Machine qemu-8-instance-00000008 terminated.
Feb 17 17:34:34 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [NOTICE]   (218574) : haproxy version is 2.8.14-c23fe91
Feb 17 17:34:34 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [NOTICE]   (218574) : path to executable is /usr/sbin/haproxy
Feb 17 17:34:34 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [WARNING]  (218574) : Exiting Master process...
Feb 17 17:34:34 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [ALERT]    (218574) : Current worker (218576) exited with code 143 (Terminated)
Feb 17 17:34:34 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218570]: [WARNING]  (218574) : All workers exited. Exiting... (0)
Feb 17 17:34:34 compute-0 systemd[1]: libpod-e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4.scope: Deactivated successfully.
Feb 17 17:34:34 compute-0 podman[218634]: 2026-02-17 17:34:34.40464696 +0000 UTC m=+0.040929261 container died e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:34:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4-userdata-shm.mount: Deactivated successfully.
Feb 17 17:34:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-89517c1589e7f0f721b34a2831bf4e4a7e1a05196f3b1ef611f729dc6237b8e0-merged.mount: Deactivated successfully.
Feb 17 17:34:34 compute-0 podman[218634]: 2026-02-17 17:34:34.446832809 +0000 UTC m=+0.083115130 container cleanup e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 17 17:34:34 compute-0 systemd[1]: libpod-conmon-e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4.scope: Deactivated successfully.
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.470 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.474 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.503 186483 DEBUG nova.compute.manager [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.504 186483 DEBUG oslo_concurrency.lockutils [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.504 186483 DEBUG oslo_concurrency.lockutils [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.504 186483 DEBUG oslo_concurrency.lockutils [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.504 186483 DEBUG nova.compute.manager [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] No waiting events found dispatching network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.505 186483 DEBUG nova.compute.manager [req-17d3a92c-b885-4f6d-b9a2-0d4a19fdd6b7 req-1e27daff-11e9-4318-a1c5-3009746d1cf9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.510 186483 INFO nova.virt.libvirt.driver [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Instance destroyed successfully.
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.510 186483 DEBUG nova.objects.instance [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 5b203313-4460-4f2b-b0d0-29e1846079ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.523 186483 DEBUG nova.virt.libvirt.vif [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:34:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-910863108',display_name='tempest-TestNetworkBasicOps-server-910863108',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-910863108',id=8,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOdUD/Bw66VXONgSaCPfh7FaRXz8ERKM2oVwL5Yz4c7ndcm8vXRzyK/LRmZ1Zg9IfI0uWKxlWNWH+xbU+ASdmRuGc9K5SqYIlXjvmJ306dy4DOUzL9AVpBmhCgZzAIMG6Q==',key_name='tempest-TestNetworkBasicOps-1601711742',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:34:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-veo80t4g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:34:29Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=5b203313-4460-4f2b-b0d0-29e1846079ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.523 186483 DEBUG nova.network.os_vif_util [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.524 186483 DEBUG nova.network.os_vif_util [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.524 186483 DEBUG os_vif [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.525 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.526 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap278d4f1d-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.527 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 podman[218664]: 2026-02-17 17:34:34.52928009 +0000 UTC m=+0.064865320 container remove e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.532 186483 INFO os_vif [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34')
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.532 186483 INFO nova.virt.libvirt.driver [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Deleting instance files /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed_del
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.533 186483 INFO nova.virt.libvirt.driver [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Deletion of /var/lib/nova/instances/5b203313-4460-4f2b-b0d0-29e1846079ed_del complete
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.533 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f0306c42-ad2b-4ea3-8be7-992e5718144a]: (4, ('Tue Feb 17 05:34:34 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 (e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4)\ne771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4\nTue Feb 17 05:34:34 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 (e771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4)\ne771aba0f5cbddcd7ac2732be30e7ea1fd37ad72b95e8aa6ae25d2812f2ed7e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.534 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[abdb4b21-2cd1-4a48-82df-247ec8ae31fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.535 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1551a1a1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:34 compute-0 kernel: tap1551a1a1-40: left promiscuous mode
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.537 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.541 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.542 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4691eea0-a4c3-4998-b36f-05853f86ce67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.560 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c7af24a5-5eba-4d09-8a35-908715fdf302]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.562 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7bb7a4-b918-4611-b5ed-b86f3b3c25b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.574 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[6df1060b-e3a0-46b1-a21b-4ca7a8e176f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342408, 'reachable_time': 42819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218695, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.576 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:34:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d1551a1a1\x2d4fd7\x2d4ca1\x2dbc7e\x2d1ca283b384c0.mount: Deactivated successfully.
Feb 17 17:34:34 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:34.577 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[5a7a1a73-9413-4a93-a66e-858ef8a75e7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.581 186483 INFO nova.compute.manager [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.582 186483 DEBUG oslo.service.loopingcall [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.582 186483 DEBUG nova.compute.manager [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.582 186483 DEBUG nova.network.neutron [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:34:34 compute-0 nova_compute[186479]: 2026-02-17 17:34:34.927 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.100 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.122 186483 WARNING nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.123 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Triggering sync for uuid 5b203313-4460-4f2b-b0d0-29e1846079ed _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.124 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.600 186483 DEBUG nova.network.neutron [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updated VIF entry in instance network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.601 186483 DEBUG nova.network.neutron [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updating instance_info_cache with network_info: [{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:35.620 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.631 186483 DEBUG oslo_concurrency.lockutils [req-0c6bea7d-2790-4b4a-b67d-58349a43aea1 req-327d62ce-d755-401d-9c16-6c9b558aa5a1 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-5b203313-4460-4f2b-b0d0-29e1846079ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.813 186483 DEBUG nova.network.neutron [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.833 186483 INFO nova.compute.manager [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Took 1.25 seconds to deallocate network for instance.
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.879 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.880 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.945 186483 DEBUG nova.compute.provider_tree [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.963 186483 DEBUG nova.scheduler.client.report [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:34:35 compute-0 nova_compute[186479]: 2026-02-17 17:34:35.986 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.009 186483 INFO nova.scheduler.client.report [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 5b203313-4460-4f2b-b0d0-29e1846079ed
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.062 186483 DEBUG oslo_concurrency.lockutils [None req-fa44b1fa-3e96-46e5-a10d-53f15ce7db9a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.064 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.940s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.065 186483 INFO nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] During sync_power_state the instance has a pending task (deleting). Skip.
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.065 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.736 186483 DEBUG nova.compute.manager [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.736 186483 DEBUG oslo_concurrency.lockutils [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.737 186483 DEBUG oslo_concurrency.lockutils [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.737 186483 DEBUG oslo_concurrency.lockutils [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "5b203313-4460-4f2b-b0d0-29e1846079ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.737 186483 DEBUG nova.compute.manager [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] No waiting events found dispatching network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:36 compute-0 nova_compute[186479]: 2026-02-17 17:34:36.737 186483 WARNING nova.compute.manager [req-c5982323-9898-403a-9a37-56d53ef15405 req-e10b584a-f975-4aba-9455-22354ec361c9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Received unexpected event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with vm_state deleted and task_state None.
Feb 17 17:34:39 compute-0 nova_compute[186479]: 2026-02-17 17:34:39.528 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:39 compute-0 nova_compute[186479]: 2026-02-17 17:34:39.975 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:40 compute-0 podman[218697]: 2026-02-17 17:34:40.752777645 +0000 UTC m=+0.086675886 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.121 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.121 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.136 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.213 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.214 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.222 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.223 186483 INFO nova.compute.claims [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.320 186483 DEBUG nova.compute.provider_tree [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.334 186483 DEBUG nova.scheduler.client.report [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.356 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.356 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.411 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.412 186483 DEBUG nova.network.neutron [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.437 186483 INFO nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.463 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.626 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.628 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.629 186483 INFO nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Creating image(s)
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.630 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.630 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.631 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.661 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:34:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.726 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.728 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.729 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.751 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.827 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.828 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.853 186483 DEBUG nova.policy [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.871 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.872 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.873 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.916 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.918 186483 DEBUG nova.virt.disk.api [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.919 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.981 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.983 186483 DEBUG nova.virt.disk.api [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.983 186483 DEBUG nova.objects.instance [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 83177619-4b43-46b9-9411-0befd9730f0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.995 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.996 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Ensure instance console log exists: /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.996 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.997 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:43 compute-0 nova_compute[186479]: 2026-02-17 17:34:43.997 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:44 compute-0 nova_compute[186479]: 2026-02-17 17:34:44.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:45 compute-0 nova_compute[186479]: 2026-02-17 17:34:45.011 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:45 compute-0 podman[218738]: 2026-02-17 17:34:45.727762073 +0000 UTC m=+0.061096897 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.345 186483 DEBUG nova.network.neutron [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Successfully updated port: 278d4f1d-34a9-436c-a8c6-104778e90a0f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.360 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.361 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.361 186483 DEBUG nova.network.neutron [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.453 186483 DEBUG nova.compute.manager [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.454 186483 DEBUG nova.compute.manager [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Refreshing instance network info cache due to event network-changed-278d4f1d-34a9-436c-a8c6-104778e90a0f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.454 186483 DEBUG oslo_concurrency.lockutils [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:34:46 compute-0 nova_compute[186479]: 2026-02-17 17:34:46.524 186483 DEBUG nova.network.neutron [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.673 186483 DEBUG nova.network.neutron [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Updating instance_info_cache with network_info: [{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.696 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.696 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Instance network_info: |[{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.697 186483 DEBUG oslo_concurrency.lockutils [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.697 186483 DEBUG nova.network.neutron [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Refreshing network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.700 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Start _get_guest_xml network_info=[{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.705 186483 WARNING nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.710 186483 DEBUG nova.virt.libvirt.host [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.711 186483 DEBUG nova.virt.libvirt.host [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.716 186483 DEBUG nova.virt.libvirt.host [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.717 186483 DEBUG nova.virt.libvirt.host [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.717 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.717 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.718 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.718 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.719 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.719 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.719 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.720 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.720 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.720 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.721 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.721 186483 DEBUG nova.virt.hardware [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.725 186483 DEBUG nova.virt.libvirt.vif [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1275674706',display_name='tempest-TestNetworkBasicOps-server-1275674706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1275674706',id=9,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMv7Up2YZYKd5nngmEV7trL5OV/CkKTu/vVXssS+aQ5Oi5KAuyoFueWRxWCxrMytT8J1jaK1wRD4ttYrw2qgydmEgr+zYQCq+PQi+2gBRyNSxjrF76VuvY6ObHHunKEqUw==',key_name='tempest-TestNetworkBasicOps-2017223567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-p6voabap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:34:43Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=83177619-4b43-46b9-9411-0befd9730f0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.726 186483 DEBUG nova.network.os_vif_util [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.727 186483 DEBUG nova.network.os_vif_util [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.728 186483 DEBUG nova.objects.instance [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 83177619-4b43-46b9-9411-0befd9730f0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.745 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <uuid>83177619-4b43-46b9-9411-0befd9730f0a</uuid>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <name>instance-00000009</name>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-1275674706</nova:name>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:34:48</nova:creationTime>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         <nova:port uuid="278d4f1d-34a9-436c-a8c6-104778e90a0f">
Feb 17 17:34:48 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <system>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="serial">83177619-4b43-46b9-9411-0befd9730f0a</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="uuid">83177619-4b43-46b9-9411-0befd9730f0a</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </system>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <os>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </os>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <features>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </features>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.config"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:c9:d0:d4"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <target dev="tap278d4f1d-34"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/console.log" append="off"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <video>
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </video>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:34:48 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:34:48 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:34:48 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:34:48 compute-0 nova_compute[186479]: </domain>
Feb 17 17:34:48 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.746 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Preparing to wait for external event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.747 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.747 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.747 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.749 186483 DEBUG nova.virt.libvirt.vif [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1275674706',display_name='tempest-TestNetworkBasicOps-server-1275674706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1275674706',id=9,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMv7Up2YZYKd5nngmEV7trL5OV/CkKTu/vVXssS+aQ5Oi5KAuyoFueWRxWCxrMytT8J1jaK1wRD4ttYrw2qgydmEgr+zYQCq+PQi+2gBRyNSxjrF76VuvY6ObHHunKEqUw==',key_name='tempest-TestNetworkBasicOps-2017223567',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-p6voabap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:34:43Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=83177619-4b43-46b9-9411-0befd9730f0a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.749 186483 DEBUG nova.network.os_vif_util [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.750 186483 DEBUG nova.network.os_vif_util [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.751 186483 DEBUG os_vif [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.751 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.752 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.752 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.755 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.756 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap278d4f1d-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.756 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap278d4f1d-34, col_values=(('external_ids', {'iface-id': '278d4f1d-34a9-436c-a8c6-104778e90a0f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:d0:d4', 'vm-uuid': '83177619-4b43-46b9-9411-0befd9730f0a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.758 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:48 compute-0 NetworkManager[56323]: <info>  [1771349688.7591] manager: (tap278d4f1d-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.760 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.764 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.765 186483 INFO os_vif [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34')
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.810 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.811 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.811 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:c9:d0:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:34:48 compute-0 nova_compute[186479]: 2026-02-17 17:34:48.812 186483 INFO nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Using config drive
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.508 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349674.5077984, 5b203313-4460-4f2b-b0d0-29e1846079ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.509 186483 INFO nova.compute.manager [-] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] VM Stopped (Lifecycle Event)
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.529 186483 DEBUG nova.compute.manager [None req-fc2d2574-404d-4353-9cd3-bb429061bc24 - - - - - -] [instance: 5b203313-4460-4f2b-b0d0-29e1846079ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.648 186483 INFO nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Creating config drive at /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.config
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.652 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdedy_1rg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.770 186483 DEBUG oslo_concurrency.processutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpdedy_1rg" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:34:49 compute-0 kernel: tap278d4f1d-34: entered promiscuous mode
Feb 17 17:34:49 compute-0 NetworkManager[56323]: <info>  [1771349689.8351] manager: (tap278d4f1d-34): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Feb 17 17:34:49 compute-0 ovn_controller[96568]: 2026-02-17T17:34:49Z|00126|binding|INFO|Claiming lport 278d4f1d-34a9-436c-a8c6-104778e90a0f for this chassis.
Feb 17 17:34:49 compute-0 ovn_controller[96568]: 2026-02-17T17:34:49Z|00127|binding|INFO|278d4f1d-34a9-436c-a8c6-104778e90a0f: Claiming fa:16:3e:c9:d0:d4 10.100.0.13
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.836 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.843 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d0:d4 10.100.0.13'], port_security=['fa:16:3e:c9:d0:d4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83177619-4b43-46b9-9411-0befd9730f0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80559a54-f2f5-4114-9204-16477dc05b22, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=278d4f1d-34a9-436c-a8c6-104778e90a0f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.844 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 278d4f1d-34a9-436c-a8c6-104778e90a0f in datapath 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 bound to our chassis
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.845 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:49 compute-0 ovn_controller[96568]: 2026-02-17T17:34:49Z|00128|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f up in Southbound
Feb 17 17:34:49 compute-0 ovn_controller[96568]: 2026-02-17T17:34:49Z|00129|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f ovn-installed in OVS
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.847 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:49 compute-0 nova_compute[186479]: 2026-02-17 17:34:49.853 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.856 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1d4d2d-9257-432c-951c-a855fad386df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.857 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1551a1a1-41 in ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.859 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1551a1a1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.859 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3693607d-c3fc-40d4-be67-7f6828abb927]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.860 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[3d55857b-6769-418d-a23c-a18cd0c5bd40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.867 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[9280b38c-30a3-4b43-9c65-99f5d5952e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 systemd-udevd[218795]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:34:49 compute-0 systemd-machined[155877]: New machine qemu-9-instance-00000009.
Feb 17 17:34:49 compute-0 NetworkManager[56323]: <info>  [1771349689.8833] device (tap278d4f1d-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:34:49 compute-0 NetworkManager[56323]: <info>  [1771349689.8837] device (tap278d4f1d-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:34:49 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.887 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d62959-bcf2-4a50-b644-a30a7721d2d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.907 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7257ad-2e1f-41ae-98e4-96e7423fb36e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 NetworkManager[56323]: <info>  [1771349689.9124] manager: (tap1551a1a1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.912 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fcd8bf-dfe9-4250-89a5-b090b8d516f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 podman[218777]: 2026-02-17 17:34:49.914538781 +0000 UTC m=+0.076555828 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.934 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[c16b6feb-343a-4b14-b979-222c34c1b754]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.937 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[036c4081-1fb3-4312-b62e-6768024931dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 NetworkManager[56323]: <info>  [1771349689.9502] device (tap1551a1a1-40): carrier: link connected
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.952 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[66bf2fb9-63f5-4271-bc28-85f50e15ef2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.964 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[0beacc46-8160-4da5-ad35-9e5ef99ecd63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1551a1a1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a0:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 344526, 'reachable_time': 19671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218836, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.975 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2e607812-f47f-40fc-9676-d571c2d44d30]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed4:a0f5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 344526, 'tstamp': 344526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218837, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:49 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:49.986 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[888d3521-a1e1-4d81-8b0d-0f34c7516878]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1551a1a1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d4:a0:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 344526, 'reachable_time': 19671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218838, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.003 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2ee10b-d2be-48f4-add8-58f2d80075ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.012 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.048 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ae20ceec-748c-45e4-a760-c7525d26b384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.049 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1551a1a1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.049 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.050 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1551a1a1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.053 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:50 compute-0 NetworkManager[56323]: <info>  [1771349690.0548] manager: (tap1551a1a1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Feb 17 17:34:50 compute-0 kernel: tap1551a1a1-40: entered promiscuous mode
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.056 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.056 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1551a1a1-40, col_values=(('external_ids', {'iface-id': '694e808e-a1c9-4900-a806-bd631fbb0446'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:50 compute-0 ovn_controller[96568]: 2026-02-17T17:34:50Z|00130|binding|INFO|Releasing lport 694e808e-a1c9-4900-a806-bd631fbb0446 from this chassis (sb_readonly=0)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.057 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.061 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.062 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.062 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[efbbf83b-9dfe-4812-ba9b-6ce5b45247cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.063 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.pid.haproxy
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:34:50 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:50.063 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'env', 'PROCESS_TAG=haproxy-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.143 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349690.1424165, 83177619-4b43-46b9-9411-0befd9730f0a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.143 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] VM Started (Lifecycle Event)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.164 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.168 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349690.1426466, 83177619-4b43-46b9-9411-0befd9730f0a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.168 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] VM Paused (Lifecycle Event)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.200 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.203 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.233 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:34:50 compute-0 podman[218877]: 2026-02-17 17:34:50.351174038 +0000 UTC m=+0.033954271 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:34:50 compute-0 podman[218877]: 2026-02-17 17:34:50.44810004 +0000 UTC m=+0.130880223 container create 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:50 compute-0 systemd[1]: Started libpod-conmon-4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705.scope.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.527 186483 DEBUG nova.compute.manager [req-fe54d293-0129-444c-a004-299aded01b68 req-379e4856-2035-4be9-bd1f-847d0eef8979 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.528 186483 DEBUG oslo_concurrency.lockutils [req-fe54d293-0129-444c-a004-299aded01b68 req-379e4856-2035-4be9-bd1f-847d0eef8979 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.528 186483 DEBUG oslo_concurrency.lockutils [req-fe54d293-0129-444c-a004-299aded01b68 req-379e4856-2035-4be9-bd1f-847d0eef8979 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.529 186483 DEBUG oslo_concurrency.lockutils [req-fe54d293-0129-444c-a004-299aded01b68 req-379e4856-2035-4be9-bd1f-847d0eef8979 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.529 186483 DEBUG nova.compute.manager [req-fe54d293-0129-444c-a004-299aded01b68 req-379e4856-2035-4be9-bd1f-847d0eef8979 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Processing event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.530 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.535 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349690.5355728, 83177619-4b43-46b9-9411-0befd9730f0a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.536 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] VM Resumed (Lifecycle Event)
Feb 17 17:34:50 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.540 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:34:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/61f11de9511b40794d89928f1ff4896dea975476f4e168110bee2b767f127ea2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.555 186483 INFO nova.virt.libvirt.driver [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Instance spawned successfully.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.557 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:34:50 compute-0 podman[218877]: 2026-02-17 17:34:50.568229096 +0000 UTC m=+0.251009289 container init 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.572 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:50 compute-0 podman[218877]: 2026-02-17 17:34:50.573949672 +0000 UTC m=+0.256729845 container start 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.588 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.592 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.592 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.593 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.593 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.594 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.595 186483 DEBUG nova.virt.libvirt.driver [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:34:50 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [NOTICE]   (218897) : New worker (218899) forked
Feb 17 17:34:50 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [NOTICE]   (218897) : Loading success.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.608 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.654 186483 INFO nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Took 7.03 seconds to spawn the instance on the hypervisor.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.654 186483 DEBUG nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.711 186483 INFO nova.compute.manager [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Took 7.53 seconds to build instance.
Feb 17 17:34:50 compute-0 nova_compute[186479]: 2026-02-17 17:34:50.729 186483 DEBUG oslo_concurrency.lockutils [None req-6e6e8414-1ccc-4724-8e2c-482a4486ceb2 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:51 compute-0 nova_compute[186479]: 2026-02-17 17:34:51.541 186483 DEBUG nova.network.neutron [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Updated VIF entry in instance network info cache for port 278d4f1d-34a9-436c-a8c6-104778e90a0f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:34:51 compute-0 nova_compute[186479]: 2026-02-17 17:34:51.542 186483 DEBUG nova.network.neutron [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Updating instance_info_cache with network_info: [{"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:51 compute-0 nova_compute[186479]: 2026-02-17 17:34:51.560 186483 DEBUG oslo_concurrency.lockutils [req-ef4149a1-56ef-4eae-8da1-01e25f38f94c req-e063528c-8fb6-4937-81ee-a6f629729f7d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-83177619-4b43-46b9-9411-0befd9730f0a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.613 186483 DEBUG nova.compute.manager [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.614 186483 DEBUG oslo_concurrency.lockutils [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.614 186483 DEBUG oslo_concurrency.lockutils [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.614 186483 DEBUG oslo_concurrency.lockutils [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.614 186483 DEBUG nova.compute.manager [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] No waiting events found dispatching network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.615 186483 WARNING nova.compute.manager [req-dd1aa76b-a922-403b-a3b4-437fd54e81b2 req-53bd2f38-61cd-40d0-a642-e52234bfaa0c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received unexpected event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with vm_state active and task_state deleting.
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.657 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.657 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.658 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.658 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.658 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.659 186483 INFO nova.compute.manager [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Terminating instance
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.660 186483 DEBUG nova.compute.manager [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:34:52 compute-0 kernel: tap278d4f1d-34 (unregistering): left promiscuous mode
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.695 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 NetworkManager[56323]: <info>  [1771349692.6968] device (tap278d4f1d-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:34:52 compute-0 ovn_controller[96568]: 2026-02-17T17:34:52Z|00131|binding|INFO|Releasing lport 278d4f1d-34a9-436c-a8c6-104778e90a0f from this chassis (sb_readonly=0)
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.703 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 ovn_controller[96568]: 2026-02-17T17:34:52Z|00132|binding|INFO|Setting lport 278d4f1d-34a9-436c-a8c6-104778e90a0f down in Southbound
Feb 17 17:34:52 compute-0 ovn_controller[96568]: 2026-02-17T17:34:52Z|00133|binding|INFO|Removing iface tap278d4f1d-34 ovn-installed in OVS
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.711 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.712 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:d0:d4 10.100.0.13'], port_security=['fa:16:3e:c9:d0:d4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '83177619-4b43-46b9-9411-0befd9730f0a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1110780057', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '9', 'neutron:security_group_ids': '4ea7d121-7f67-4d41-b55a-38229e1e4d1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.215', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80559a54-f2f5-4114-9204-16477dc05b22, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=278d4f1d-34a9-436c-a8c6-104778e90a0f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.713 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 278d4f1d-34a9-436c-a8c6-104778e90a0f in datapath 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 unbound from our chassis
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.714 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.716 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c90af0-e472-42ab-a947-0bf3e24a907f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.717 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 namespace which is not needed anymore
Feb 17 17:34:52 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 17 17:34:52 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.384s CPU time.
Feb 17 17:34:52 compute-0 systemd-machined[155877]: Machine qemu-9-instance-00000009 terminated.
Feb 17 17:34:52 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [NOTICE]   (218897) : haproxy version is 2.8.14-c23fe91
Feb 17 17:34:52 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [NOTICE]   (218897) : path to executable is /usr/sbin/haproxy
Feb 17 17:34:52 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [WARNING]  (218897) : Exiting Master process...
Feb 17 17:34:52 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [ALERT]    (218897) : Current worker (218899) exited with code 143 (Terminated)
Feb 17 17:34:52 compute-0 neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0[218893]: [WARNING]  (218897) : All workers exited. Exiting... (0)
Feb 17 17:34:52 compute-0 systemd[1]: libpod-4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705.scope: Deactivated successfully.
Feb 17 17:34:52 compute-0 podman[218929]: 2026-02-17 17:34:52.844970632 +0000 UTC m=+0.040369155 container died 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 17 17:34:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705-userdata-shm.mount: Deactivated successfully.
Feb 17 17:34:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-61f11de9511b40794d89928f1ff4896dea975476f4e168110bee2b767f127ea2-merged.mount: Deactivated successfully.
Feb 17 17:34:52 compute-0 podman[218929]: 2026-02-17 17:34:52.891235985 +0000 UTC m=+0.086634498 container cleanup 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:34:52 compute-0 systemd[1]: libpod-conmon-4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705.scope: Deactivated successfully.
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.910 186483 INFO nova.virt.libvirt.driver [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Instance destroyed successfully.
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.911 186483 DEBUG nova.objects.instance [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 83177619-4b43-46b9-9411-0befd9730f0a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.928 186483 DEBUG nova.virt.libvirt.vif [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:34:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1275674706',display_name='tempest-TestNetworkBasicOps-server-1275674706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1275674706',id=9,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMv7Up2YZYKd5nngmEV7trL5OV/CkKTu/vVXssS+aQ5Oi5KAuyoFueWRxWCxrMytT8J1jaK1wRD4ttYrw2qgydmEgr+zYQCq+PQi+2gBRyNSxjrF76VuvY6ObHHunKEqUw==',key_name='tempest-TestNetworkBasicOps-2017223567',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:34:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-p6voabap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:34:50Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=83177619-4b43-46b9-9411-0befd9730f0a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.929 186483 DEBUG nova.network.os_vif_util [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "address": "fa:16:3e:c9:d0:d4", "network": {"id": "1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0", "bridge": "br-int", "label": "tempest-network-smoke--1447073493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap278d4f1d-34", "ovs_interfaceid": "278d4f1d-34a9-436c-a8c6-104778e90a0f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.930 186483 DEBUG nova.network.os_vif_util [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.930 186483 DEBUG os_vif [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.933 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.933 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap278d4f1d-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.934 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.936 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.938 186483 INFO os_vif [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:d0:d4,bridge_name='br-int',has_traffic_filtering=True,id=278d4f1d-34a9-436c-a8c6-104778e90a0f,network=Network(1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap278d4f1d-34')
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.939 186483 INFO nova.virt.libvirt.driver [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Deleting instance files /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a_del
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.939 186483 INFO nova.virt.libvirt.driver [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Deletion of /var/lib/nova/instances/83177619-4b43-46b9-9411-0befd9730f0a_del complete
Feb 17 17:34:52 compute-0 podman[218967]: 2026-02-17 17:34:52.960838176 +0000 UTC m=+0.047126196 container remove 4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.965 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a57e7a1a-8ff0-4518-b030-801f7475bf20]: (4, ('Tue Feb 17 05:34:52 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 (4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705)\n4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705\nTue Feb 17 05:34:52 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 (4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705)\n4702e66d888f02943256729537c386337a4471ec5202cfae989c59fd02973705\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.966 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[704e351d-1913-47fe-9de2-4f90908ed010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.967 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1551a1a1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:34:52 compute-0 kernel: tap1551a1a1-40: left promiscuous mode
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.969 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.972 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 nova_compute[186479]: 2026-02-17 17:34:52.972 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.974 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[abc37f52-2071-46df-be0f-f4b2e1159ccd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.987 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e17c409b-4216-4636-bf9c-8a5712d6e38b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.989 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b188646d-e68e-4aa9-9951-28be3854e42a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:52.999 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a130a027-f6fd-4799-b01a-19352c0cc1be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 344521, 'reachable_time': 31646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218993, 'error': None, 'target': 'ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:53.001 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1551a1a1-4fd7-4ca1-bc7e-1ca283b384c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:34:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:34:53.002 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[44c0db67-71e9-478f-bcb8-709b2590fc7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:34:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d1551a1a1\x2d4fd7\x2d4ca1\x2dbc7e\x2d1ca283b384c0.mount: Deactivated successfully.
Feb 17 17:34:53 compute-0 nova_compute[186479]: 2026-02-17 17:34:53.007 186483 INFO nova.compute.manager [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 17 17:34:53 compute-0 nova_compute[186479]: 2026-02-17 17:34:53.008 186483 DEBUG oslo.service.loopingcall [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:34:53 compute-0 nova_compute[186479]: 2026-02-17 17:34:53.009 186483 DEBUG nova.compute.manager [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:34:53 compute-0 nova_compute[186479]: 2026-02-17 17:34:53.010 186483 DEBUG nova.network.neutron [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.692 186483 DEBUG nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.693 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.694 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.694 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.695 186483 DEBUG nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] No waiting events found dispatching network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.695 186483 DEBUG nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-vif-unplugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.695 186483 DEBUG nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.696 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "83177619-4b43-46b9-9411-0befd9730f0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.696 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.697 186483 DEBUG oslo_concurrency.lockutils [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.697 186483 DEBUG nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] No waiting events found dispatching network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.697 186483 WARNING nova.compute.manager [req-3a9fd8ea-59a6-4f11-8485-461cee76f2c9 req-c3ad714e-0ecc-48e5-9ff9-6417eb11e079 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Received unexpected event network-vif-plugged-278d4f1d-34a9-436c-a8c6-104778e90a0f for instance with vm_state active and task_state deleting.
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.824 186483 DEBUG nova.network.neutron [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.842 186483 INFO nova.compute.manager [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Took 1.83 seconds to deallocate network for instance.
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.886 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.887 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.938 186483 DEBUG nova.compute.provider_tree [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.954 186483 DEBUG nova.scheduler.client.report [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:34:54 compute-0 nova_compute[186479]: 2026-02-17 17:34:54.975 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:55 compute-0 nova_compute[186479]: 2026-02-17 17:34:55.000 186483 INFO nova.scheduler.client.report [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 83177619-4b43-46b9-9411-0befd9730f0a
Feb 17 17:34:55 compute-0 nova_compute[186479]: 2026-02-17 17:34:55.055 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:55 compute-0 nova_compute[186479]: 2026-02-17 17:34:55.071 186483 DEBUG oslo_concurrency.lockutils [None req-b6542382-1a25-45eb-89d5-0f167cc31ae3 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "83177619-4b43-46b9-9411-0befd9730f0a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:34:57 compute-0 podman[218994]: 2026-02-17 17:34:57.734003408 +0000 UTC m=+0.071262681 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:34:57 compute-0 podman[218995]: 2026-02-17 17:34:57.768782618 +0000 UTC m=+0.104292930 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 17 17:34:57 compute-0 nova_compute[186479]: 2026-02-17 17:34:57.936 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:34:59 compute-0 nova_compute[186479]: 2026-02-17 17:34:59.327 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:00 compute-0 nova_compute[186479]: 2026-02-17 17:35:00.056 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:00 compute-0 nova_compute[186479]: 2026-02-17 17:35:00.103 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:00 compute-0 nova_compute[186479]: 2026-02-17 17:35:00.143 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:01 compute-0 nova_compute[186479]: 2026-02-17 17:35:01.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:01 compute-0 nova_compute[186479]: 2026-02-17 17:35:01.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:02 compute-0 nova_compute[186479]: 2026-02-17 17:35:02.939 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:03 compute-0 nova_compute[186479]: 2026-02-17 17:35:03.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:03 compute-0 nova_compute[186479]: 2026-02-17 17:35:03.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:03 compute-0 podman[219033]: 2026-02-17 17:35:03.712372584 +0000 UTC m=+0.045987748 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.327 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.479 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.480 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.20707321166992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.480 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.481 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.569 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.569 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.592 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.612 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.640 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:35:04 compute-0 nova_compute[186479]: 2026-02-17 17:35:04.640 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:05 compute-0 nova_compute[186479]: 2026-02-17 17:35:05.105 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:07 compute-0 nova_compute[186479]: 2026-02-17 17:35:07.909 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349692.907982, 83177619-4b43-46b9-9411-0befd9730f0a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:35:07 compute-0 nova_compute[186479]: 2026-02-17 17:35:07.909 186483 INFO nova.compute.manager [-] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] VM Stopped (Lifecycle Event)
Feb 17 17:35:07 compute-0 nova_compute[186479]: 2026-02-17 17:35:07.931 186483 DEBUG nova.compute.manager [None req-06bd547c-ec0a-4676-afdf-b5baa2ad3ee8 - - - - - -] [instance: 83177619-4b43-46b9-9411-0befd9730f0a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:35:07 compute-0 nova_compute[186479]: 2026-02-17 17:35:07.942 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:09 compute-0 nova_compute[186479]: 2026-02-17 17:35:09.642 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:35:09 compute-0 nova_compute[186479]: 2026-02-17 17:35:09.642 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:35:09 compute-0 nova_compute[186479]: 2026-02-17 17:35:09.643 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:35:09 compute-0 nova_compute[186479]: 2026-02-17 17:35:09.656 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:35:10 compute-0 nova_compute[186479]: 2026-02-17 17:35:10.106 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:10.953 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:10.954 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:10.954 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:11 compute-0 podman[219059]: 2026-02-17 17:35:11.751883591 +0000 UTC m=+0.093739767 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:35:12 compute-0 nova_compute[186479]: 2026-02-17 17:35:12.945 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:15 compute-0 nova_compute[186479]: 2026-02-17 17:35:15.107 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:16 compute-0 podman[219087]: 2026-02-17 17:35:16.720850794 +0000 UTC m=+0.058978218 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.648 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.648 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.671 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.751 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.752 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.761 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.761 186483 INFO nova.compute.claims [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.869 186483 DEBUG nova.compute.provider_tree [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.884 186483 DEBUG nova.scheduler.client.report [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.911 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.912 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.948 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.964 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.964 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:35:17 compute-0 nova_compute[186479]: 2026-02-17 17:35:17.990 186483 INFO nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.008 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.105 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.107 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.107 186483 INFO nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Creating image(s)
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.108 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.109 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.110 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.134 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.208 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.210 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.211 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.228 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.295 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.297 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.329 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.330 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.331 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.385 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.386 186483 DEBUG nova.virt.disk.api [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.386 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.426 186483 DEBUG nova.policy [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.452 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.453 186483 DEBUG nova.virt.disk.api [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.454 186483 DEBUG nova.objects.instance [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 7bc19460-80c1-4421-b690-1f1e1ceea9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.472 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.472 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Ensure instance console log exists: /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.473 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.474 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:18 compute-0 nova_compute[186479]: 2026-02-17 17:35:18.474 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:19 compute-0 nova_compute[186479]: 2026-02-17 17:35:19.002 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Successfully created port: a49129ec-46ee-475e-a57d-a06dbd841222 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:35:19 compute-0 nova_compute[186479]: 2026-02-17 17:35:19.958 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Successfully updated port: a49129ec-46ee-475e-a57d-a06dbd841222 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:35:19 compute-0 nova_compute[186479]: 2026-02-17 17:35:19.975 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:35:19 compute-0 nova_compute[186479]: 2026-02-17 17:35:19.976 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:35:19 compute-0 nova_compute[186479]: 2026-02-17 17:35:19.976 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.075 186483 DEBUG nova.compute.manager [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.076 186483 DEBUG nova.compute.manager [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing instance network info cache due to event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.076 186483 DEBUG oslo_concurrency.lockutils [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.110 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.361 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:35:20 compute-0 podman[219126]: 2026-02-17 17:35:20.697936816 +0000 UTC m=+0.042094985 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z)
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.950 186483 DEBUG nova.network.neutron [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updating instance_info_cache with network_info: [{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.972 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.973 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Instance network_info: |[{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.973 186483 DEBUG oslo_concurrency.lockutils [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.973 186483 DEBUG nova.network.neutron [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.976 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Start _get_guest_xml network_info=[{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.980 186483 WARNING nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.985 186483 DEBUG nova.virt.libvirt.host [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.986 186483 DEBUG nova.virt.libvirt.host [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.989 186483 DEBUG nova.virt.libvirt.host [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.989 186483 DEBUG nova.virt.libvirt.host [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.989 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.990 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.990 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.990 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.990 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.991 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.992 186483 DEBUG nova.virt.hardware [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.995 186483 DEBUG nova.virt.libvirt.vif [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:35:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1302438103',display_name='tempest-TestNetworkBasicOps-server-1302438103',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1302438103',id=10,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBExSrlCxXvmVDkQnDNKJnkGjldm8Sx5qjmGL9Kx2Ttx2P3Wh7/fNS/4c+ZhrjBxcEvLznuNc2tZPL4OYjufYmRcKsUy2ZsyuleTY9pCWVUvZBHXplihSYTA+oY5L2MnTxA==',key_name='tempest-TestNetworkBasicOps-357887296',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-rqc0nkci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:35:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=7bc19460-80c1-4421-b690-1f1e1ceea9cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.995 186483 DEBUG nova.network.os_vif_util [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.996 186483 DEBUG nova.network.os_vif_util [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:35:20 compute-0 nova_compute[186479]: 2026-02-17 17:35:20.996 186483 DEBUG nova.objects.instance [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 7bc19460-80c1-4421-b690-1f1e1ceea9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.009 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <uuid>7bc19460-80c1-4421-b690-1f1e1ceea9cd</uuid>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <name>instance-0000000a</name>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-1302438103</nova:name>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:35:20</nova:creationTime>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         <nova:port uuid="a49129ec-46ee-475e-a57d-a06dbd841222">
Feb 17 17:35:21 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <system>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="serial">7bc19460-80c1-4421-b690-1f1e1ceea9cd</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="uuid">7bc19460-80c1-4421-b690-1f1e1ceea9cd</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </system>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <os>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </os>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <features>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </features>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.config"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:e8:c3:22"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <target dev="tapa49129ec-46"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/console.log" append="off"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <video>
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </video>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:35:21 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:35:21 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:35:21 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:35:21 compute-0 nova_compute[186479]: </domain>
Feb 17 17:35:21 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.010 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Preparing to wait for external event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.010 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.010 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.010 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.011 186483 DEBUG nova.virt.libvirt.vif [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:35:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1302438103',display_name='tempest-TestNetworkBasicOps-server-1302438103',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1302438103',id=10,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBExSrlCxXvmVDkQnDNKJnkGjldm8Sx5qjmGL9Kx2Ttx2P3Wh7/fNS/4c+ZhrjBxcEvLznuNc2tZPL4OYjufYmRcKsUy2ZsyuleTY9pCWVUvZBHXplihSYTA+oY5L2MnTxA==',key_name='tempest-TestNetworkBasicOps-357887296',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-rqc0nkci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:35:18Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=7bc19460-80c1-4421-b690-1f1e1ceea9cd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.011 186483 DEBUG nova.network.os_vif_util [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.012 186483 DEBUG nova.network.os_vif_util [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.012 186483 DEBUG os_vif [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.012 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.013 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.013 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.016 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.017 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa49129ec-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.017 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa49129ec-46, col_values=(('external_ids', {'iface-id': 'a49129ec-46ee-475e-a57d-a06dbd841222', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:c3:22', 'vm-uuid': '7bc19460-80c1-4421-b690-1f1e1ceea9cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.019 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.0199] manager: (tapa49129ec-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.022 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.024 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.025 186483 INFO os_vif [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46')
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.067 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.068 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.068 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:e8:c3:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.069 186483 INFO nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Using config drive
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.595 186483 INFO nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Creating config drive at /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.config
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.600 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiztha5ft execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.724 186483 DEBUG oslo_concurrency.processutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpiztha5ft" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:35:21 compute-0 kernel: tapa49129ec-46: entered promiscuous mode
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.7855] manager: (tapa49129ec-46): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.787 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 ovn_controller[96568]: 2026-02-17T17:35:21Z|00134|binding|INFO|Claiming lport a49129ec-46ee-475e-a57d-a06dbd841222 for this chassis.
Feb 17 17:35:21 compute-0 ovn_controller[96568]: 2026-02-17T17:35:21Z|00135|binding|INFO|a49129ec-46ee-475e-a57d-a06dbd841222: Claiming fa:16:3e:e8:c3:22 10.100.0.10
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.791 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.795 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.798 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.811 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:c3:22 10.100.0.10'], port_security=['fa:16:3e:e8:c3:22 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7bc19460-80c1-4421-b690-1f1e1ceea9cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c325a97-6f74-4e81-8e34-a66452f159a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a913505-4891-4087-93c0-bc7240143da4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43ced29f-4a8f-44d6-9255-5822f72c7045, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=a49129ec-46ee-475e-a57d-a06dbd841222) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.813 105898 INFO neutron.agent.ovn.metadata.agent [-] Port a49129ec-46ee-475e-a57d-a06dbd841222 in datapath 3c325a97-6f74-4e81-8e34-a66452f159a5 bound to our chassis
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.814 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c325a97-6f74-4e81-8e34-a66452f159a5
Feb 17 17:35:21 compute-0 systemd-udevd[219165]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:35:21 compute-0 ovn_controller[96568]: 2026-02-17T17:35:21Z|00136|binding|INFO|Setting lport a49129ec-46ee-475e-a57d-a06dbd841222 ovn-installed in OVS
Feb 17 17:35:21 compute-0 ovn_controller[96568]: 2026-02-17T17:35:21Z|00137|binding|INFO|Setting lport a49129ec-46ee-475e-a57d-a06dbd841222 up in Southbound
Feb 17 17:35:21 compute-0 systemd-machined[155877]: New machine qemu-10-instance-0000000a.
Feb 17 17:35:21 compute-0 nova_compute[186479]: 2026-02-17 17:35:21.826 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.827 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1eec9038-4297-42fc-8272-0e1ec62c33b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.829 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c325a97-61 in ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.831 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c325a97-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.831 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ebaf235b-09eb-4066-af80-06774b5d3109]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.833 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd75793-d09e-4f15-bacc-0580bada0ea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.8394] device (tapa49129ec-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.8408] device (tapa49129ec-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.842 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[e93a05b1-8a45-41e6-a4f3-f04a1cb1bc42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.856 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ec144003-c5eb-4298-a6eb-0c6beb94340e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.878 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[15662ee9-5641-4367-b607-6fac31adc57a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.8867] manager: (tap3c325a97-60): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.885 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cb8b23-a537-4bcd-ac11-4fb87ba94019]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 systemd-udevd[219170]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.913 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d21bd0-3fc5-421a-909a-6e06a1f71881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.917 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[b3676ece-64a5-4425-a2f5-f929db9194b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 NetworkManager[56323]: <info>  [1771349721.9366] device (tap3c325a97-60): carrier: link connected
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.942 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[6371cd16-5b52-4fb5-926c-81a5f7920f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.959 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[8966d4a6-9ba4-4480-a4eb-e8344e03aa7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c325a97-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:3e:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347724, 'reachable_time': 20524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219199, 'error': None, 'target': 'ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.977 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c48943f6-ab70-4ed4-86f6-d82e4213990a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:3e05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 347724, 'tstamp': 347724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219200, 'error': None, 'target': 'ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:21 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:21.993 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[91bf5d0d-7e33-4bc6-b0d0-ce3fb9cfc055]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c325a97-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:3e:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347724, 'reachable_time': 20524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219201, 'error': None, 'target': 'ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.022 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b5c3ff-7b2f-4f41-9e4d-aa12675bc447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.087 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7b076633-57d0-4689-8755-efdd610f6699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.088 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c325a97-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.089 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.090 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c325a97-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.092 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:22 compute-0 kernel: tap3c325a97-60: entered promiscuous mode
Feb 17 17:35:22 compute-0 NetworkManager[56323]: <info>  [1771349722.0946] manager: (tap3c325a97-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.095 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.096 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c325a97-60, col_values=(('external_ids', {'iface-id': '9b129eac-7631-4245-a05f-74b721b8e22a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.098 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:22 compute-0 ovn_controller[96568]: 2026-02-17T17:35:22Z|00138|binding|INFO|Releasing lport 9b129eac-7631-4245-a05f-74b721b8e22a from this chassis (sb_readonly=0)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.098 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.099 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c325a97-6f74-4e81-8e34-a66452f159a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c325a97-6f74-4e81-8e34-a66452f159a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.100 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[eccaa191-e1fa-4d36-951f-d2a8439b8b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.101 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.101 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-3c325a97-6f74-4e81-8e34-a66452f159a5
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/3c325a97-6f74-4e81-8e34-a66452f159a5.pid.haproxy
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 3c325a97-6f74-4e81-8e34-a66452f159a5
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:35:22 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:22.102 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5', 'env', 'PROCESS_TAG=haproxy-3c325a97-6f74-4e81-8e34-a66452f159a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c325a97-6f74-4e81-8e34-a66452f159a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.157 186483 DEBUG nova.compute.manager [req-fa05ed33-c62d-42fc-998f-787bdab2ce25 req-ae21550a-ac16-4c50-896b-de91308a73ba 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.158 186483 DEBUG oslo_concurrency.lockutils [req-fa05ed33-c62d-42fc-998f-787bdab2ce25 req-ae21550a-ac16-4c50-896b-de91308a73ba 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.158 186483 DEBUG oslo_concurrency.lockutils [req-fa05ed33-c62d-42fc-998f-787bdab2ce25 req-ae21550a-ac16-4c50-896b-de91308a73ba 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.159 186483 DEBUG oslo_concurrency.lockutils [req-fa05ed33-c62d-42fc-998f-787bdab2ce25 req-ae21550a-ac16-4c50-896b-de91308a73ba 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.159 186483 DEBUG nova.compute.manager [req-fa05ed33-c62d-42fc-998f-787bdab2ce25 req-ae21550a-ac16-4c50-896b-de91308a73ba 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Processing event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.201 186483 DEBUG nova.network.neutron [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updated VIF entry in instance network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.202 186483 DEBUG nova.network.neutron [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updating instance_info_cache with network_info: [{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.218 186483 DEBUG oslo_concurrency.lockutils [req-843840d4-bd40-45a9-b356-cbbc051953c4 req-2d65bbd2-a73a-45f9-ab65-8ee095302b85 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:35:22 compute-0 podman[219235]: 2026-02-17 17:35:22.435843186 +0000 UTC m=+0.052489032 container create 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.454 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349722.4542377, 7bc19460-80c1-4421-b690-1f1e1ceea9cd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.455 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] VM Started (Lifecycle Event)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.457 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.461 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.464 186483 INFO nova.virt.libvirt.driver [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Instance spawned successfully.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.464 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.472 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.474 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:35:22 compute-0 systemd[1]: Started libpod-conmon-510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac.scope.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.485 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.485 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.486 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.486 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.486 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.487 186483 DEBUG nova.virt.libvirt.driver [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:35:22 compute-0 podman[219235]: 2026-02-17 17:35:22.40204783 +0000 UTC m=+0.018693666 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:35:22 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:35:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c8bf368777d17ba68dd9b8daf7b2d6ddf746290d546d55396adf536e41afb0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.512 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.513 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349722.4544456, 7bc19460-80c1-4421-b690-1f1e1ceea9cd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.513 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] VM Paused (Lifecycle Event)
Feb 17 17:35:22 compute-0 podman[219235]: 2026-02-17 17:35:22.522169346 +0000 UTC m=+0.138815182 container init 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:35:22 compute-0 podman[219235]: 2026-02-17 17:35:22.529144233 +0000 UTC m=+0.145790049 container start 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.544 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.547 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349722.4602847, 7bc19460-80c1-4421-b690-1f1e1ceea9cd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.547 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] VM Resumed (Lifecycle Event)
Feb 17 17:35:22 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [NOTICE]   (219259) : New worker (219261) forked
Feb 17 17:35:22 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [NOTICE]   (219259) : Loading success.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.560 186483 INFO nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Took 4.45 seconds to spawn the instance on the hypervisor.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.561 186483 DEBUG nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.569 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.570 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.595 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.628 186483 INFO nova.compute.manager [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Took 4.91 seconds to build instance.
Feb 17 17:35:22 compute-0 nova_compute[186479]: 2026-02-17 17:35:22.644 186483 DEBUG oslo_concurrency.lockutils [None req-ce7d3f6c-8094-4ed6-9da5-2ab51aea9a45 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.240 186483 DEBUG nova.compute.manager [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.241 186483 DEBUG oslo_concurrency.lockutils [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.241 186483 DEBUG oslo_concurrency.lockutils [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.241 186483 DEBUG oslo_concurrency.lockutils [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.242 186483 DEBUG nova.compute.manager [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] No waiting events found dispatching network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:35:24 compute-0 nova_compute[186479]: 2026-02-17 17:35:24.242 186483 WARNING nova.compute.manager [req-e48ddd16-0a3f-4e89-bb8d-d82c4f07215a req-93f1d07e-5db4-472c-a3a8-11fe4dc8d41e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received unexpected event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 for instance with vm_state active and task_state None.
Feb 17 17:35:25 compute-0 nova_compute[186479]: 2026-02-17 17:35:25.111 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:25 compute-0 ovn_controller[96568]: 2026-02-17T17:35:25Z|00139|binding|INFO|Releasing lport 9b129eac-7631-4245-a05f-74b721b8e22a from this chassis (sb_readonly=0)
Feb 17 17:35:25 compute-0 nova_compute[186479]: 2026-02-17 17:35:25.691 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:25 compute-0 NetworkManager[56323]: <info>  [1771349725.6919] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 17 17:35:25 compute-0 NetworkManager[56323]: <info>  [1771349725.6928] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Feb 17 17:35:25 compute-0 ovn_controller[96568]: 2026-02-17T17:35:25Z|00140|binding|INFO|Releasing lport 9b129eac-7631-4245-a05f-74b721b8e22a from this chassis (sb_readonly=0)
Feb 17 17:35:25 compute-0 nova_compute[186479]: 2026-02-17 17:35:25.709 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:25 compute-0 nova_compute[186479]: 2026-02-17 17:35:25.715 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.019 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.310 186483 DEBUG nova.compute.manager [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.310 186483 DEBUG nova.compute.manager [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing instance network info cache due to event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.310 186483 DEBUG oslo_concurrency.lockutils [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.310 186483 DEBUG oslo_concurrency.lockutils [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:35:26 compute-0 nova_compute[186479]: 2026-02-17 17:35:26.311 186483 DEBUG nova.network.neutron [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:35:27 compute-0 nova_compute[186479]: 2026-02-17 17:35:27.162 186483 DEBUG nova.network.neutron [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updated VIF entry in instance network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:35:27 compute-0 nova_compute[186479]: 2026-02-17 17:35:27.163 186483 DEBUG nova.network.neutron [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updating instance_info_cache with network_info: [{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:35:27 compute-0 nova_compute[186479]: 2026-02-17 17:35:27.197 186483 DEBUG oslo_concurrency.lockutils [req-99be7293-c835-4c28-b823-bf478ea083d7 req-18be1ac4-4d4c-45e5-b3ab-de934ac152c3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:35:28 compute-0 podman[219271]: 2026-02-17 17:35:28.724172336 +0000 UTC m=+0.065666247 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:35:28 compute-0 podman[219272]: 2026-02-17 17:35:28.753273051 +0000 UTC m=+0.086865954 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:35:30 compute-0 nova_compute[186479]: 2026-02-17 17:35:30.113 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:31 compute-0 nova_compute[186479]: 2026-02-17 17:35:31.021 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:34 compute-0 podman[219325]: 2026-02-17 17:35:34.707710825 +0000 UTC m=+0.049770718 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:35:35 compute-0 nova_compute[186479]: 2026-02-17 17:35:35.116 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:35 compute-0 ovn_controller[96568]: 2026-02-17T17:35:35Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:c3:22 10.100.0.10
Feb 17 17:35:35 compute-0 ovn_controller[96568]: 2026-02-17T17:35:35Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:c3:22 10.100.0.10
Feb 17 17:35:36 compute-0 nova_compute[186479]: 2026-02-17 17:35:36.024 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:40 compute-0 nova_compute[186479]: 2026-02-17 17:35:40.118 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:40 compute-0 nova_compute[186479]: 2026-02-17 17:35:40.394 186483 INFO nova.compute.manager [None req-88b4e18a-6c1e-469d-ae2f-8215942d506a 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Get console output
Feb 17 17:35:40 compute-0 nova_compute[186479]: 2026-02-17 17:35:40.399 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:35:41 compute-0 nova_compute[186479]: 2026-02-17 17:35:41.027 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:42 compute-0 ovn_controller[96568]: 2026-02-17T17:35:42Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:c3:22 10.100.0.10
Feb 17 17:35:42 compute-0 podman[219349]: 2026-02-17 17:35:42.736771075 +0000 UTC m=+0.078176325 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.120 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.785 186483 DEBUG nova.compute.manager [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.786 186483 DEBUG nova.compute.manager [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing instance network info cache due to event network-changed-a49129ec-46ee-475e-a57d-a06dbd841222. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.786 186483 DEBUG oslo_concurrency.lockutils [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.786 186483 DEBUG oslo_concurrency.lockutils [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.786 186483 DEBUG nova.network.neutron [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Refreshing network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.886 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.887 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.887 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.888 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.889 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.891 186483 INFO nova.compute.manager [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Terminating instance
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.892 186483 DEBUG nova.compute.manager [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:35:45 compute-0 kernel: tapa49129ec-46 (unregistering): left promiscuous mode
Feb 17 17:35:45 compute-0 NetworkManager[56323]: <info>  [1771349745.9264] device (tapa49129ec-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:35:45 compute-0 ovn_controller[96568]: 2026-02-17T17:35:45Z|00141|binding|INFO|Releasing lport a49129ec-46ee-475e-a57d-a06dbd841222 from this chassis (sb_readonly=0)
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.935 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:45 compute-0 ovn_controller[96568]: 2026-02-17T17:35:45Z|00142|binding|INFO|Setting lport a49129ec-46ee-475e-a57d-a06dbd841222 down in Southbound
Feb 17 17:35:45 compute-0 ovn_controller[96568]: 2026-02-17T17:35:45Z|00143|binding|INFO|Removing iface tapa49129ec-46 ovn-installed in OVS
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.939 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:45 compute-0 nova_compute[186479]: 2026-02-17 17:35:45.944 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:45.948 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:c3:22 10.100.0.10'], port_security=['fa:16:3e:e8:c3:22 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7bc19460-80c1-4421-b690-1f1e1ceea9cd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c325a97-6f74-4e81-8e34-a66452f159a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6a913505-4891-4087-93c0-bc7240143da4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43ced29f-4a8f-44d6-9255-5822f72c7045, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=a49129ec-46ee-475e-a57d-a06dbd841222) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:35:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:45.950 105898 INFO neutron.agent.ovn.metadata.agent [-] Port a49129ec-46ee-475e-a57d-a06dbd841222 in datapath 3c325a97-6f74-4e81-8e34-a66452f159a5 unbound from our chassis
Feb 17 17:35:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:45.952 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c325a97-6f74-4e81-8e34-a66452f159a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:35:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:45.953 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f154759d-a015-4db0-b834-b1bfb0db4871]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:45 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:45.954 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5 namespace which is not needed anymore
Feb 17 17:35:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Feb 17 17:35:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 12.972s CPU time.
Feb 17 17:35:45 compute-0 systemd-machined[155877]: Machine qemu-10-instance-0000000a terminated.
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.028 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [NOTICE]   (219259) : haproxy version is 2.8.14-c23fe91
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [NOTICE]   (219259) : path to executable is /usr/sbin/haproxy
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [WARNING]  (219259) : Exiting Master process...
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [WARNING]  (219259) : Exiting Master process...
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [ALERT]    (219259) : Current worker (219261) exited with code 143 (Terminated)
Feb 17 17:35:46 compute-0 neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5[219255]: [WARNING]  (219259) : All workers exited. Exiting... (0)
Feb 17 17:35:46 compute-0 systemd[1]: libpod-510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac.scope: Deactivated successfully.
Feb 17 17:35:46 compute-0 podman[219403]: 2026-02-17 17:35:46.104384006 +0000 UTC m=+0.056703204 container died 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac-userdata-shm.mount: Deactivated successfully.
Feb 17 17:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-88c8bf368777d17ba68dd9b8daf7b2d6ddf746290d546d55396adf536e41afb0-merged.mount: Deactivated successfully.
Feb 17 17:35:46 compute-0 podman[219403]: 2026-02-17 17:35:46.154643265 +0000 UTC m=+0.106962493 container cleanup 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.154 186483 INFO nova.virt.libvirt.driver [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Instance destroyed successfully.
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.154 186483 DEBUG nova.objects.instance [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 7bc19460-80c1-4421-b690-1f1e1ceea9cd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:35:46 compute-0 systemd[1]: libpod-conmon-510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac.scope: Deactivated successfully.
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.173 186483 DEBUG nova.virt.libvirt.vif [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:35:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1302438103',display_name='tempest-TestNetworkBasicOps-server-1302438103',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1302438103',id=10,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBExSrlCxXvmVDkQnDNKJnkGjldm8Sx5qjmGL9Kx2Ttx2P3Wh7/fNS/4c+ZhrjBxcEvLznuNc2tZPL4OYjufYmRcKsUy2ZsyuleTY9pCWVUvZBHXplihSYTA+oY5L2MnTxA==',key_name='tempest-TestNetworkBasicOps-357887296',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:35:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-rqc0nkci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:35:22Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=7bc19460-80c1-4421-b690-1f1e1ceea9cd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.173 186483 DEBUG nova.network.os_vif_util [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.174 186483 DEBUG nova.network.os_vif_util [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.174 186483 DEBUG os_vif [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.176 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.176 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa49129ec-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.177 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.183 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.185 186483 INFO os_vif [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:c3:22,bridge_name='br-int',has_traffic_filtering=True,id=a49129ec-46ee-475e-a57d-a06dbd841222,network=Network(3c325a97-6f74-4e81-8e34-a66452f159a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa49129ec-46')
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.186 186483 INFO nova.virt.libvirt.driver [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Deleting instance files /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd_del
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.186 186483 INFO nova.virt.libvirt.driver [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Deletion of /var/lib/nova/instances/7bc19460-80c1-4421-b690-1f1e1ceea9cd_del complete
Feb 17 17:35:46 compute-0 podman[219451]: 2026-02-17 17:35:46.214733258 +0000 UTC m=+0.041253825 container remove 510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.220 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[855c9867-a105-4ee0-81e1-62d81124d8dc]: (4, ('Tue Feb 17 05:35:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5 (510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac)\n510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac\nTue Feb 17 05:35:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5 (510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac)\n510730402c2cba757d490624da84bb19fcc9472c8474aaaf714dc01976e792ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.222 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[57a4694d-b46e-4175-b9d4-2fcc022e6ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.223 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c325a97-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.225 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:46 compute-0 kernel: tap3c325a97-60: left promiscuous mode
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.231 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.234 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[28ad8e94-d5c8-49ff-ae06-2db57f4cacf9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.247 186483 INFO nova.compute.manager [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.248 186483 DEBUG oslo.service.loopingcall [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.249 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[04183021-bb71-454c-91da-94238b4629f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.250 186483 DEBUG nova.compute.manager [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:35:46 compute-0 nova_compute[186479]: 2026-02-17 17:35:46.250 186483 DEBUG nova.network.neutron [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.250 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[947de34d-16b7-4fa3-bf2c-84e29c0cea2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.270 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[feec32ed-b8a8-4695-8b82-7b53a19a9b28]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 347718, 'reachable_time': 39356, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219467, 'error': None, 'target': 'ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d3c325a97\x2d6f74\x2d4e81\x2d8e34\x2da66452f159a5.mount: Deactivated successfully.
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.273 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c325a97-6f74-4e81-8e34-a66452f159a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:35:46 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:46.273 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[7acce656-3f23-4210-9b13-37017241a26f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.416 186483 DEBUG nova.network.neutron [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updated VIF entry in instance network info cache for port a49129ec-46ee-475e-a57d-a06dbd841222. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.417 186483 DEBUG nova.network.neutron [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updating instance_info_cache with network_info: [{"id": "a49129ec-46ee-475e-a57d-a06dbd841222", "address": "fa:16:3e:e8:c3:22", "network": {"id": "3c325a97-6f74-4e81-8e34-a66452f159a5", "bridge": "br-int", "label": "tempest-network-smoke--1931288479", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa49129ec-46", "ovs_interfaceid": "a49129ec-46ee-475e-a57d-a06dbd841222", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.423 186483 DEBUG nova.network.neutron [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.446 186483 DEBUG oslo_concurrency.lockutils [req-7a54eec9-0e02-4834-9f43-a41d55c2a25f req-fc62e9c7-4e1f-4d74-b3ef-57dd85379bb0 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-7bc19460-80c1-4421-b690-1f1e1ceea9cd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.451 186483 INFO nova.compute.manager [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Took 1.20 seconds to deallocate network for instance.
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.493 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.494 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.564 186483 DEBUG nova.compute.provider_tree [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.579 186483 DEBUG nova.scheduler.client.report [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.600 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.623 186483 INFO nova.scheduler.client.report [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 7bc19460-80c1-4421-b690-1f1e1ceea9cd
Feb 17 17:35:47 compute-0 podman[219468]: 2026-02-17 17:35:47.723718318 +0000 UTC m=+0.057430771 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.838 186483 DEBUG oslo_concurrency.lockutils [None req-8cd00aac-8ad1-4f48-9f16-3c1550b2bc31 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.878 186483 DEBUG nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-vif-unplugged-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.879 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.879 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.880 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.881 186483 DEBUG nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] No waiting events found dispatching network-vif-unplugged-a49129ec-46ee-475e-a57d-a06dbd841222 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.881 186483 WARNING nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received unexpected event network-vif-unplugged-a49129ec-46ee-475e-a57d-a06dbd841222 for instance with vm_state deleted and task_state None.
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.881 186483 DEBUG nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.882 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.882 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.883 186483 DEBUG oslo_concurrency.lockutils [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "7bc19460-80c1-4421-b690-1f1e1ceea9cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.883 186483 DEBUG nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] No waiting events found dispatching network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.884 186483 WARNING nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received unexpected event network-vif-plugged-a49129ec-46ee-475e-a57d-a06dbd841222 for instance with vm_state deleted and task_state None.
Feb 17 17:35:47 compute-0 nova_compute[186479]: 2026-02-17 17:35:47.884 186483 DEBUG nova.compute.manager [req-63eb807a-a067-40ac-b0b9-850ef11bdbe9 req-64c64919-fd4d-4693-aa1c-57b14dff31f6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Received event network-vif-deleted-a49129ec-46ee-475e-a57d-a06dbd841222 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:35:50 compute-0 nova_compute[186479]: 2026-02-17 17:35:50.123 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:50 compute-0 nova_compute[186479]: 2026-02-17 17:35:50.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:50 compute-0 nova_compute[186479]: 2026-02-17 17:35:50.153 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:51 compute-0 nova_compute[186479]: 2026-02-17 17:35:51.178 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:51 compute-0 podman[219493]: 2026-02-17 17:35:51.713752629 +0000 UTC m=+0.055908775 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, version=9.7, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 17 17:35:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:54.652 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:35:54 compute-0 nova_compute[186479]: 2026-02-17 17:35:54.652 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:54 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:54.654 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:35:55 compute-0 nova_compute[186479]: 2026-02-17 17:35:55.125 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:56 compute-0 nova_compute[186479]: 2026-02-17 17:35:56.181 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:35:58 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:35:58.657 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:35:59 compute-0 podman[219516]: 2026-02-17 17:35:59.706878972 +0000 UTC m=+0.052635457 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 17 17:35:59 compute-0 podman[219517]: 2026-02-17 17:35:59.728133899 +0000 UTC m=+0.064441099 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 17 17:36:00 compute-0 nova_compute[186479]: 2026-02-17 17:36:00.127 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:01 compute-0 nova_compute[186479]: 2026-02-17 17:36:01.152 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349746.1506639, 7bc19460-80c1-4421-b690-1f1e1ceea9cd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:01 compute-0 nova_compute[186479]: 2026-02-17 17:36:01.152 186483 INFO nova.compute.manager [-] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] VM Stopped (Lifecycle Event)
Feb 17 17:36:01 compute-0 nova_compute[186479]: 2026-02-17 17:36:01.179 186483 DEBUG nova.compute.manager [None req-e38d092f-9305-4d54-81ba-7e74b2ae3f62 - - - - - -] [instance: 7bc19460-80c1-4421-b690-1f1e1ceea9cd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:01 compute-0 nova_compute[186479]: 2026-02-17 17:36:01.183 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:01 compute-0 nova_compute[186479]: 2026-02-17 17:36:01.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:02 compute-0 nova_compute[186479]: 2026-02-17 17:36:02.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:03 compute-0 nova_compute[186479]: 2026-02-17 17:36:03.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:04 compute-0 nova_compute[186479]: 2026-02-17 17:36:04.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:04 compute-0 nova_compute[186479]: 2026-02-17 17:36:04.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:04 compute-0 nova_compute[186479]: 2026-02-17 17:36:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:04 compute-0 nova_compute[186479]: 2026-02-17 17:36:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:04 compute-0 nova_compute[186479]: 2026-02-17 17:36:04.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.129 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:05 compute-0 podman[219552]: 2026-02-17 17:36:05.705414589 +0000 UTC m=+0.053653611 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.859 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.860 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.879 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.970 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.971 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.979 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:36:05 compute-0 nova_compute[186479]: 2026-02-17 17:36:05.980 186483 INFO nova.compute.claims [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.103 186483 DEBUG nova.compute.provider_tree [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.118 186483 DEBUG nova.scheduler.client.report [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.145 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.146 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.186 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.194 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.195 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.221 186483 INFO nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.249 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.336 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.337 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.337 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.338 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.363 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.366 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.366 186483 INFO nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Creating image(s)
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.367 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.368 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.370 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.400 186483 DEBUG nova.policy [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.404 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.463 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.464 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.465 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.480 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.539 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.542 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.571 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.572 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.573 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.619 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.621 186483 DEBUG nova.virt.disk.api [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.621 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.642 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.643 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5750MB free_disk=73.20693588256836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.644 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.644 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.661 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.661 186483 DEBUG nova.virt.disk.api [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.662 186483 DEBUG nova.objects.instance [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.682 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.682 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Ensure instance console log exists: /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.683 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.683 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.684 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.704 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Instance 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.704 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.704 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.743 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.757 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.783 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:36:06 compute-0 nova_compute[186479]: 2026-02-17 17:36:06.783 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:07 compute-0 nova_compute[186479]: 2026-02-17 17:36:07.054 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Successfully created port: 2c569f82-9221-46a0-b481-e5d95c02ed5c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:36:07 compute-0 nova_compute[186479]: 2026-02-17 17:36:07.778 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:10 compute-0 nova_compute[186479]: 2026-02-17 17:36:10.131 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:10.954 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:10.955 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:10.955 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.189 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.460 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.460 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.577 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Successfully updated port: 2c569f82-9221-46a0-b481-e5d95c02ed5c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.601 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.601 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.602 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.771 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.774 186483 DEBUG nova.compute.manager [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.774 186483 DEBUG nova.compute.manager [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing instance network info cache due to event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:11 compute-0 nova_compute[186479]: 2026-02-17 17:36:11.774 186483 DEBUG oslo_concurrency.lockutils [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.489 186483 DEBUG nova.network.neutron [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.511 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.512 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Instance network_info: |[{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.513 186483 DEBUG oslo_concurrency.lockutils [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.513 186483 DEBUG nova.network.neutron [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.518 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Start _get_guest_xml network_info=[{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.526 186483 WARNING nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.536 186483 DEBUG nova.virt.libvirt.host [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.537 186483 DEBUG nova.virt.libvirt.host [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.542 186483 DEBUG nova.virt.libvirt.host [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.543 186483 DEBUG nova.virt.libvirt.host [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.544 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.544 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.545 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.546 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.546 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.547 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.547 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.548 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.548 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.549 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.549 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.550 186483 DEBUG nova.virt.hardware [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.558 186483 DEBUG nova.virt.libvirt.vif [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583735795',display_name='tempest-TestNetworkBasicOps-server-583735795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583735795',id=11,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDZdxdh7Y8sCHr0ByIyzH8BM0AjjH4br5ABCZRoTjXvr3Yn+jp5fn1rx4ummIhhsikquoJBFsBIXS6y0HjczWSj9RZWPZhWjtz7yoGSiC0hUryculZGDU9ynXoq5gn5u4g==',key_name='tempest-TestNetworkBasicOps-1832315980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-120jnykj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:36:06Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.559 186483 DEBUG nova.network.os_vif_util [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.560 186483 DEBUG nova.network.os_vif_util [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.561 186483 DEBUG nova.objects.instance [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.577 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <uuid>2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf</uuid>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <name>instance-0000000b</name>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-583735795</nova:name>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:36:12</nova:creationTime>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         <nova:port uuid="2c569f82-9221-46a0-b481-e5d95c02ed5c">
Feb 17 17:36:12 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <system>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="serial">2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="uuid">2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </system>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <os>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </os>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <features>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </features>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.config"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:00:92:e0"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <target dev="tap2c569f82-92"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/console.log" append="off"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <video>
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </video>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:36:12 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:36:12 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:36:12 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:36:12 compute-0 nova_compute[186479]: </domain>
Feb 17 17:36:12 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.578 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Preparing to wait for external event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.579 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.579 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.580 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.581 186483 DEBUG nova.virt.libvirt.vif [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583735795',display_name='tempest-TestNetworkBasicOps-server-583735795',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583735795',id=11,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDZdxdh7Y8sCHr0ByIyzH8BM0AjjH4br5ABCZRoTjXvr3Yn+jp5fn1rx4ummIhhsikquoJBFsBIXS6y0HjczWSj9RZWPZhWjtz7yoGSiC0hUryculZGDU9ynXoq5gn5u4g==',key_name='tempest-TestNetworkBasicOps-1832315980',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-120jnykj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:36:06Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.581 186483 DEBUG nova.network.os_vif_util [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.582 186483 DEBUG nova.network.os_vif_util [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.583 186483 DEBUG os_vif [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.584 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.584 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.585 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.588 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.589 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c569f82-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.589 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c569f82-92, col_values=(('external_ids', {'iface-id': '2c569f82-9221-46a0-b481-e5d95c02ed5c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:92:e0', 'vm-uuid': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.591 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:12 compute-0 NetworkManager[56323]: <info>  [1771349772.5921] manager: (tap2c569f82-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.593 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.599 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.601 186483 INFO os_vif [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92')
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.655 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.655 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.656 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:00:92:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:36:12 compute-0 nova_compute[186479]: 2026-02-17 17:36:12.657 186483 INFO nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Using config drive
Feb 17 17:36:13 compute-0 podman[219595]: 2026-02-17 17:36:13.792216615 +0000 UTC m=+0.126958900 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.376 186483 INFO nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Creating config drive at /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.config
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.379 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5aja0omf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.496 186483 DEBUG oslo_concurrency.processutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp5aja0omf" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:14 compute-0 kernel: tap2c569f82-92: entered promiscuous mode
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.5626] manager: (tap2c569f82-92): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Feb 17 17:36:14 compute-0 ovn_controller[96568]: 2026-02-17T17:36:14Z|00144|binding|INFO|Claiming lport 2c569f82-9221-46a0-b481-e5d95c02ed5c for this chassis.
Feb 17 17:36:14 compute-0 ovn_controller[96568]: 2026-02-17T17:36:14Z|00145|binding|INFO|2c569f82-9221-46a0-b481-e5d95c02ed5c: Claiming fa:16:3e:00:92:e0 10.100.0.11
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.564 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.574 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.581 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:92:e0 10.100.0.11'], port_security=['fa:16:3e:00:92:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-641e4c07-2901-48bf-a652-443b9ce7f994', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbda417a-4d86-41ac-b08e-f696e30a840a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9416c76a-6d88-4459-97f8-aa5f83d3ca8b, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=2c569f82-9221-46a0-b481-e5d95c02ed5c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.583 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 2c569f82-9221-46a0-b481-e5d95c02ed5c in datapath 641e4c07-2901-48bf-a652-443b9ce7f994 bound to our chassis
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.584 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 641e4c07-2901-48bf-a652-443b9ce7f994
Feb 17 17:36:14 compute-0 systemd-udevd[219639]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:36:14 compute-0 systemd-machined[155877]: New machine qemu-11-instance-0000000b.
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.593 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[51fd1dbf-e90b-4025-81b0-c4dd0d61148c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.594 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap641e4c07-21 in ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.595 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap641e4c07-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.595 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[111674c7-dca2-4168-941f-8f35daa240e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.596 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ae042d39-340d-45f2-979f-9a910c16959a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.6043] device (tap2c569f82-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.6051] device (tap2c569f82-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.606 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[5100a5db-0638-4ff0-a1e6-e0c84c506373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Feb 17 17:36:14 compute-0 ovn_controller[96568]: 2026-02-17T17:36:14Z|00146|binding|INFO|Setting lport 2c569f82-9221-46a0-b481-e5d95c02ed5c ovn-installed in OVS
Feb 17 17:36:14 compute-0 ovn_controller[96568]: 2026-02-17T17:36:14Z|00147|binding|INFO|Setting lport 2c569f82-9221-46a0-b481-e5d95c02ed5c up in Southbound
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.612 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.618 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[8f605c11-e0b8-45a0-96c1-410a3bc1f363]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.640 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[4582495f-f81d-4171-9823-80e8dd10e95c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 systemd-udevd[219643]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.6489] manager: (tap641e4c07-20): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.648 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[951310e5-b308-4112-b9d6-72fc74db9fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.673 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bbe9a7-0190-40e2-9e95-c8481afb1610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.676 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9cff7a-da74-4159-a900-5c780b209389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.6907] device (tap641e4c07-20): carrier: link connected
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.693 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c57aff-c35c-4479-8c2d-87aa2bf3a0e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.705 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f408747a-8001-41b8-a126-592ff93f8450]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap641e4c07-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:26:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353000, 'reachable_time': 20104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219672, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.718 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[ac73ae99-ce5e-488a-8779-56d8757dbb37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:2695'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353000, 'tstamp': 353000}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219673, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.730 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc47b8a-bbe1-4ea7-91ca-c6e774f6d2e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap641e4c07-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:26:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353000, 'reachable_time': 20104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219674, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.755 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[04073d76-9572-4bef-996a-e7d3a9687d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.792 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[c537b813-fc80-48ec-93a2-4bfac391c524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.793 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap641e4c07-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.794 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.795 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap641e4c07-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:14 compute-0 kernel: tap641e4c07-20: entered promiscuous mode
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.838 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap641e4c07-20, col_values=(('external_ids', {'iface-id': 'c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.836 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 NetworkManager[56323]: <info>  [1771349774.8402] manager: (tap641e4c07-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.839 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 ovn_controller[96568]: 2026-02-17T17:36:14Z|00148|binding|INFO|Releasing lport c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3 from this chassis (sb_readonly=0)
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.843 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/641e4c07-2901-48bf-a652-443b9ce7f994.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/641e4c07-2901-48bf-a652-443b9ce7f994.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.843 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.847 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[31f9b8ba-de1e-4049-8d49-42843a3d65d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:14 compute-0 nova_compute[186479]: 2026-02-17 17:36:14.846 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.847 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-641e4c07-2901-48bf-a652-443b9ce7f994
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/641e4c07-2901-48bf-a652-443b9ce7f994.pid.haproxy
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID 641e4c07-2901-48bf-a652-443b9ce7f994
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:36:14 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:14.849 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'env', 'PROCESS_TAG=haproxy-641e4c07-2901-48bf-a652-443b9ce7f994', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/641e4c07-2901-48bf-a652-443b9ce7f994.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.130 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349775.129153, 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.131 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] VM Started (Lifecycle Event)
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.133 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.150 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.155 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349775.1333888, 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.155 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] VM Paused (Lifecycle Event)
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.173 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.177 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.194 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:36:15 compute-0 podman[219713]: 2026-02-17 17:36:15.2159785 +0000 UTC m=+0.047479322 container create 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:36:15 compute-0 systemd[1]: Started libpod-conmon-3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08.scope.
Feb 17 17:36:15 compute-0 podman[219713]: 2026-02-17 17:36:15.187144043 +0000 UTC m=+0.018644885 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:36:15 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:36:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b006fb4561ad72bc8c56752cdc428a29754f8d013dcefd51e2976c4f04f68945/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:36:15 compute-0 podman[219713]: 2026-02-17 17:36:15.313369934 +0000 UTC m=+0.144870756 container init 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 17 17:36:15 compute-0 podman[219713]: 2026-02-17 17:36:15.319021739 +0000 UTC m=+0.150522551 container start 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 17 17:36:15 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [NOTICE]   (219734) : New worker (219736) forked
Feb 17 17:36:15 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [NOTICE]   (219734) : Loading success.
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.553 186483 DEBUG nova.compute.manager [req-e09a5d2c-dc67-4640-a6c8-e9850031afb3 req-ed853d6e-415b-4736-93f5-0d3398783845 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.554 186483 DEBUG oslo_concurrency.lockutils [req-e09a5d2c-dc67-4640-a6c8-e9850031afb3 req-ed853d6e-415b-4736-93f5-0d3398783845 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.555 186483 DEBUG oslo_concurrency.lockutils [req-e09a5d2c-dc67-4640-a6c8-e9850031afb3 req-ed853d6e-415b-4736-93f5-0d3398783845 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.555 186483 DEBUG oslo_concurrency.lockutils [req-e09a5d2c-dc67-4640-a6c8-e9850031afb3 req-ed853d6e-415b-4736-93f5-0d3398783845 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.556 186483 DEBUG nova.compute.manager [req-e09a5d2c-dc67-4640-a6c8-e9850031afb3 req-ed853d6e-415b-4736-93f5-0d3398783845 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Processing event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.557 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.562 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349775.5618663, 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.562 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] VM Resumed (Lifecycle Event)
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.566 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.570 186483 INFO nova.virt.libvirt.driver [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Instance spawned successfully.
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.570 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.586 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.599 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.607 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.607 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.608 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.609 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.610 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.610 186483 DEBUG nova.virt.libvirt.driver [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.621 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.678 186483 INFO nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Took 9.31 seconds to spawn the instance on the hypervisor.
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.679 186483 DEBUG nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.743 186483 INFO nova.compute.manager [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Took 9.80 seconds to build instance.
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.757 186483 DEBUG oslo_concurrency.lockutils [None req-d602a0d2-3346-4b39-8ba9-311380d0bf2c 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.816 186483 DEBUG nova.network.neutron [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updated VIF entry in instance network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.817 186483 DEBUG nova.network.neutron [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:15 compute-0 nova_compute[186479]: 2026-02-17 17:36:15.829 186483 DEBUG oslo_concurrency.lockutils [req-f06b3e4e-b99e-40e1-9f4e-3ffa7e5dbba3 req-270d6a04-441e-4d4d-acd5-68662efc4e5a 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.591 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.622 186483 DEBUG nova.compute.manager [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.623 186483 DEBUG oslo_concurrency.lockutils [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.623 186483 DEBUG oslo_concurrency.lockutils [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.624 186483 DEBUG oslo_concurrency.lockutils [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.624 186483 DEBUG nova.compute.manager [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:36:17 compute-0 nova_compute[186479]: 2026-02-17 17:36:17.625 186483 WARNING nova.compute.manager [req-d7975d95-f1de-4375-a793-ae4db5aa0182 req-de54207e-4d98-4be1-ad95-291f84f3521c 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state None.
Feb 17 17:36:18 compute-0 podman[219745]: 2026-02-17 17:36:18.728355895 +0000 UTC m=+0.060664678 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:36:20 compute-0 nova_compute[186479]: 2026-02-17 17:36:20.135 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:20 compute-0 NetworkManager[56323]: <info>  [1771349780.5436] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Feb 17 17:36:20 compute-0 NetworkManager[56323]: <info>  [1771349780.5465] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Feb 17 17:36:20 compute-0 nova_compute[186479]: 2026-02-17 17:36:20.542 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:20 compute-0 ovn_controller[96568]: 2026-02-17T17:36:20Z|00149|binding|INFO|Releasing lport c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3 from this chassis (sb_readonly=0)
Feb 17 17:36:20 compute-0 nova_compute[186479]: 2026-02-17 17:36:20.568 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:20 compute-0 ovn_controller[96568]: 2026-02-17T17:36:20Z|00150|binding|INFO|Releasing lport c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3 from this chassis (sb_readonly=0)
Feb 17 17:36:20 compute-0 nova_compute[186479]: 2026-02-17 17:36:20.577 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:21 compute-0 nova_compute[186479]: 2026-02-17 17:36:21.499 186483 DEBUG nova.compute.manager [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:21 compute-0 nova_compute[186479]: 2026-02-17 17:36:21.500 186483 DEBUG nova.compute.manager [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing instance network info cache due to event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:21 compute-0 nova_compute[186479]: 2026-02-17 17:36:21.501 186483 DEBUG oslo_concurrency.lockutils [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:21 compute-0 nova_compute[186479]: 2026-02-17 17:36:21.502 186483 DEBUG oslo_concurrency.lockutils [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:21 compute-0 nova_compute[186479]: 2026-02-17 17:36:21.502 186483 DEBUG nova.network.neutron [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:36:22 compute-0 nova_compute[186479]: 2026-02-17 17:36:22.594 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:22 compute-0 podman[219771]: 2026-02-17 17:36:22.741928557 +0000 UTC m=+0.078744580 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1770267347, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc.)
Feb 17 17:36:22 compute-0 nova_compute[186479]: 2026-02-17 17:36:22.953 186483 DEBUG nova.network.neutron [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updated VIF entry in instance network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:36:22 compute-0 nova_compute[186479]: 2026-02-17 17:36:22.954 186483 DEBUG nova.network.neutron [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:22 compute-0 nova_compute[186479]: 2026-02-17 17:36:22.971 186483 DEBUG oslo_concurrency.lockutils [req-a1bd7f68-4932-49d7-9cae-6a0fd3f9c563 req-fad1674d-caa3-4814-a955-5b4b301cb132 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:25 compute-0 nova_compute[186479]: 2026-02-17 17:36:25.137 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:26 compute-0 ovn_controller[96568]: 2026-02-17T17:36:26Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:92:e0 10.100.0.11
Feb 17 17:36:26 compute-0 ovn_controller[96568]: 2026-02-17T17:36:26Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:92:e0 10.100.0.11
Feb 17 17:36:27 compute-0 nova_compute[186479]: 2026-02-17 17:36:27.598 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.090 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.091 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.109 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.238 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.239 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.250 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.251 186483 INFO nova.compute.claims [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.369 186483 DEBUG nova.compute.provider_tree [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.387 186483 DEBUG nova.scheduler.client.report [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.412 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.413 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.475 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.476 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.495 186483 INFO nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.514 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.606 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.608 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.608 186483 INFO nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Creating image(s)
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.609 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.610 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.610 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.627 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.708 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.709 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.709 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.734 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.793 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.794 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.827 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.828 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.828 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.878 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.880 186483 DEBUG nova.virt.disk.api [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.880 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.925 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.926 186483 DEBUG nova.virt.disk.api [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.926 186483 DEBUG nova.objects.instance [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid db5b187e-9b4b-4bcf-a142-465cac63f18a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.946 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.947 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Ensure instance console log exists: /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.948 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.948 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:28 compute-0 nova_compute[186479]: 2026-02-17 17:36:28.948 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:29 compute-0 nova_compute[186479]: 2026-02-17 17:36:29.455 186483 DEBUG nova.policy [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:36:30 compute-0 nova_compute[186479]: 2026-02-17 17:36:30.140 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:30 compute-0 podman[219825]: 2026-02-17 17:36:30.712826567 +0000 UTC m=+0.059404049 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 17 17:36:30 compute-0 podman[219826]: 2026-02-17 17:36:30.7503151 +0000 UTC m=+0.085543921 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 17 17:36:31 compute-0 nova_compute[186479]: 2026-02-17 17:36:31.392 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Successfully created port: e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.601 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.789 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Successfully updated port: e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.807 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.808 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.808 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.881 186483 DEBUG nova.compute.manager [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.881 186483 DEBUG nova.compute.manager [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing instance network info cache due to event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:32 compute-0 nova_compute[186479]: 2026-02-17 17:36:32.882 186483 DEBUG oslo_concurrency.lockutils [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:33 compute-0 nova_compute[186479]: 2026-02-17 17:36:33.368 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.458 186483 DEBUG nova.network.neutron [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updating instance_info_cache with network_info: [{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.489 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.489 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Instance network_info: |[{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.490 186483 DEBUG oslo_concurrency.lockutils [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.490 186483 DEBUG nova.network.neutron [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.493 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Start _get_guest_xml network_info=[{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.497 186483 WARNING nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.502 186483 DEBUG nova.virt.libvirt.host [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.503 186483 DEBUG nova.virt.libvirt.host [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.507 186483 DEBUG nova.virt.libvirt.host [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.508 186483 DEBUG nova.virt.libvirt.host [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.508 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.508 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.509 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.510 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.510 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.510 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.510 186483 DEBUG nova.virt.hardware [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.513 186483 DEBUG nova.virt.libvirt.vif [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:36:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1141949017',display_name='tempest-TestNetworkBasicOps-server-1141949017',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1141949017',id=12,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMY9kFaPF0ad6/uB/CqSdqudglFsmVzh3fOsOsHzCokheVVxikzrTKMeUfSPm+xkyFvcPpj3vWQpfoC3CKEz3KEMBROnXSpQ3X6qvEUlwaD4nCaIZGb/XIZnqeudDtBPCA==',key_name='tempest-TestNetworkBasicOps-1963528792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-nzibojz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:36:28Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=db5b187e-9b4b-4bcf-a142-465cac63f18a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.514 186483 DEBUG nova.network.os_vif_util [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.514 186483 DEBUG nova.network.os_vif_util [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.515 186483 DEBUG nova.objects.instance [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid db5b187e-9b4b-4bcf-a142-465cac63f18a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.531 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <uuid>db5b187e-9b4b-4bcf-a142-465cac63f18a</uuid>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <name>instance-0000000c</name>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-1141949017</nova:name>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:36:34</nova:creationTime>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         <nova:port uuid="e70422a1-9986-4f6b-b8a3-4c7f3a7d7710">
Feb 17 17:36:34 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <system>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="serial">db5b187e-9b4b-4bcf-a142-465cac63f18a</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="uuid">db5b187e-9b4b-4bcf-a142-465cac63f18a</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </system>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <os>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </os>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <features>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </features>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.config"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:29:90:91"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <target dev="tape70422a1-99"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/console.log" append="off"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <video>
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </video>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:36:34 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:36:34 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:36:34 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:36:34 compute-0 nova_compute[186479]: </domain>
Feb 17 17:36:34 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.532 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Preparing to wait for external event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.532 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.532 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.532 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.533 186483 DEBUG nova.virt.libvirt.vif [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:36:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1141949017',display_name='tempest-TestNetworkBasicOps-server-1141949017',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1141949017',id=12,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMY9kFaPF0ad6/uB/CqSdqudglFsmVzh3fOsOsHzCokheVVxikzrTKMeUfSPm+xkyFvcPpj3vWQpfoC3CKEz3KEMBROnXSpQ3X6qvEUlwaD4nCaIZGb/XIZnqeudDtBPCA==',key_name='tempest-TestNetworkBasicOps-1963528792',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-nzibojz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:36:28Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=db5b187e-9b4b-4bcf-a142-465cac63f18a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.533 186483 DEBUG nova.network.os_vif_util [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.534 186483 DEBUG nova.network.os_vif_util [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.534 186483 DEBUG os_vif [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.534 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.535 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.535 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.537 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.538 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape70422a1-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.538 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape70422a1-99, col_values=(('external_ids', {'iface-id': 'e70422a1-9986-4f6b-b8a3-4c7f3a7d7710', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:90:91', 'vm-uuid': 'db5b187e-9b4b-4bcf-a142-465cac63f18a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.539 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:34 compute-0 NetworkManager[56323]: <info>  [1771349794.5412] manager: (tape70422a1-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.542 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.545 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.545 186483 INFO os_vif [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99')
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.587 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.587 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.587 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:29:90:91, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.588 186483 INFO nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Using config drive
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.867 186483 INFO nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Creating config drive at /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.config
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.870 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpky_vzra0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:36:34 compute-0 nova_compute[186479]: 2026-02-17 17:36:34.993 186483 DEBUG oslo_concurrency.processutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpky_vzra0" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:36:35 compute-0 kernel: tape70422a1-99: entered promiscuous mode
Feb 17 17:36:35 compute-0 ovn_controller[96568]: 2026-02-17T17:36:35Z|00151|binding|INFO|Claiming lport e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 for this chassis.
Feb 17 17:36:35 compute-0 ovn_controller[96568]: 2026-02-17T17:36:35Z|00152|binding|INFO|e70422a1-9986-4f6b-b8a3-4c7f3a7d7710: Claiming fa:16:3e:29:90:91 10.100.0.4
Feb 17 17:36:35 compute-0 NetworkManager[56323]: <info>  [1771349795.0409] manager: (tape70422a1-99): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.040 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.045 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:35 compute-0 ovn_controller[96568]: 2026-02-17T17:36:35Z|00153|binding|INFO|Setting lport e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 ovn-installed in OVS
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.046 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:35 compute-0 ovn_controller[96568]: 2026-02-17T17:36:35Z|00154|binding|INFO|Setting lport e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 up in Southbound
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.047 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:90:91 10.100.0.4'], port_security=['fa:16:3e:29:90:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-641e4c07-2901-48bf-a652-443b9ce7f994', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7007f5af-2625-4331-8d6a-75bbeb10d462', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9416c76a-6d88-4459-97f8-aa5f83d3ca8b, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.048 105898 INFO neutron.agent.ovn.metadata.agent [-] Port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 in datapath 641e4c07-2901-48bf-a652-443b9ce7f994 bound to our chassis
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.049 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 641e4c07-2901-48bf-a652-443b9ce7f994
Feb 17 17:36:35 compute-0 systemd-machined[155877]: New machine qemu-12-instance-0000000c.
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.062 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dcb888-1045-4ca1-8f72-64c546944032]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Feb 17 17:36:35 compute-0 systemd-udevd[219888]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.084 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[493a4311-9bcc-4fd0-b1f5-b60870ed88e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.088 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[f9711d6d-40fc-41fd-9ac3-3dc0870ad64e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 NetworkManager[56323]: <info>  [1771349795.0926] device (tape70422a1-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:36:35 compute-0 NetworkManager[56323]: <info>  [1771349795.0936] device (tape70422a1-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.114 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[cec6ca18-6b04-4e9a-83c7-de5f4c9bc0a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.128 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[956e5233-2d85-4c5b-8c45-6a36cbd073cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap641e4c07-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:26:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353000, 'reachable_time': 20104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219898, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.139 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[35d096e0-fc79-4e0e-a18b-8f5f03235cee]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap641e4c07-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353008, 'tstamp': 353008}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219900, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap641e4c07-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353010, 'tstamp': 353010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219900, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.141 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.142 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap641e4c07-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.144 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap641e4c07-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.145 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.145 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap641e4c07-20, col_values=(('external_ids', {'iface-id': 'c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:36:35 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:35.145 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.347 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349795.3474135, db5b187e-9b4b-4bcf-a142-465cac63f18a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.348 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] VM Started (Lifecycle Event)
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.365 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.369 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349795.3483315, db5b187e-9b4b-4bcf-a142-465cac63f18a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.370 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] VM Paused (Lifecycle Event)
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.394 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.397 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.415 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.540 186483 DEBUG nova.compute.manager [req-12eab046-1b9d-4af9-a4de-c2c9dfd09aca req-b5311783-0277-449c-9f23-f910e533397d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.541 186483 DEBUG oslo_concurrency.lockutils [req-12eab046-1b9d-4af9-a4de-c2c9dfd09aca req-b5311783-0277-449c-9f23-f910e533397d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.541 186483 DEBUG oslo_concurrency.lockutils [req-12eab046-1b9d-4af9-a4de-c2c9dfd09aca req-b5311783-0277-449c-9f23-f910e533397d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.542 186483 DEBUG oslo_concurrency.lockutils [req-12eab046-1b9d-4af9-a4de-c2c9dfd09aca req-b5311783-0277-449c-9f23-f910e533397d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.542 186483 DEBUG nova.compute.manager [req-12eab046-1b9d-4af9-a4de-c2c9dfd09aca req-b5311783-0277-449c-9f23-f910e533397d 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Processing event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.543 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.547 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.548 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349795.5473554, db5b187e-9b4b-4bcf-a142-465cac63f18a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.548 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] VM Resumed (Lifecycle Event)
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.553 186483 INFO nova.virt.libvirt.driver [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Instance spawned successfully.
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.553 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.575 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.583 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.588 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.589 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.590 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.591 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.591 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.592 186483 DEBUG nova.virt.libvirt.driver [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.621 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.682 186483 INFO nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Took 7.08 seconds to spawn the instance on the hypervisor.
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.683 186483 DEBUG nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.759 186483 INFO nova.compute.manager [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Took 7.55 seconds to build instance.
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.773 186483 DEBUG oslo_concurrency.lockutils [None req-2006e9f5-73bc-4b85-a3c3-ac3da0ccc31d 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.818 186483 DEBUG nova.network.neutron [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updated VIF entry in instance network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.819 186483 DEBUG nova.network.neutron [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updating instance_info_cache with network_info: [{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:35 compute-0 nova_compute[186479]: 2026-02-17 17:36:35.834 186483 DEBUG oslo_concurrency.lockutils [req-6891deb0-d34f-4549-9947-8c2f2f2407d1 req-98f61188-e5a4-4e0f-86f2-1ee5e7fdc775 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:36 compute-0 podman[219908]: 2026-02-17 17:36:36.716223059 +0000 UTC m=+0.052614545 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.613 186483 DEBUG nova.compute.manager [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.614 186483 DEBUG oslo_concurrency.lockutils [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.615 186483 DEBUG oslo_concurrency.lockutils [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.615 186483 DEBUG oslo_concurrency.lockutils [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.616 186483 DEBUG nova.compute.manager [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] No waiting events found dispatching network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:36:37 compute-0 nova_compute[186479]: 2026-02-17 17:36:37.617 186483 WARNING nova.compute.manager [req-7d14378f-32ba-490d-a144-a0e8eddf24b2 req-6afbad1b-232a-40aa-880c-56181c43ea07 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received unexpected event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 for instance with vm_state active and task_state None.
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.144 186483 DEBUG nova.compute.manager [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.145 186483 DEBUG nova.compute.manager [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing instance network info cache due to event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.145 186483 DEBUG oslo_concurrency.lockutils [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.145 186483 DEBUG oslo_concurrency.lockutils [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.145 186483 DEBUG nova.network.neutron [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:36:39 compute-0 nova_compute[186479]: 2026-02-17 17:36:39.541 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:40 compute-0 nova_compute[186479]: 2026-02-17 17:36:40.130 186483 DEBUG nova.network.neutron [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updated VIF entry in instance network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:36:40 compute-0 nova_compute[186479]: 2026-02-17 17:36:40.130 186483 DEBUG nova.network.neutron [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updating instance_info_cache with network_info: [{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:40 compute-0 nova_compute[186479]: 2026-02-17 17:36:40.179 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:40 compute-0 nova_compute[186479]: 2026-02-17 17:36:40.203 186483 DEBUG oslo_concurrency.lockutils [req-b05f7834-fe43-4ac4-a1e2-6ef9b08867df req-d460cd2f-f7a8-45e2-a16f-2eac0906fbaa 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.718 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'name': 'tempest-TestNetworkBasicOps-server-583735795', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'user_id': '3f041abe92134380b8de39091bce5989', 'hostId': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.720 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'name': 'tempest-TestNetworkBasicOps-server-1141949017', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'user_id': '3f041abe92134380b8de39091bce5989', 'hostId': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.720 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.733 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/memory.usage volume: 46.515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.749 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.750 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance db5b187e-9b4b-4bcf-a142-465cac63f18a: ceilometer.compute.pollsters.NoVolumeException
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3cb746d-28e3-482f-8e6b-a2f2ce603fad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.515625, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'timestamp': '2026-02-17T17:36:43.720816', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '39c0196c-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.098813504, 'message_signature': '75a392504b4e94e0e99487141370b06872180bb52095ca6012dd4e3d46a600e3'}]}, 'timestamp': '2026-02-17 17:36:43.750216', '_unique_id': '8f7374f7091d48d1b0c494756d9225a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.755 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf / tap2c569f82-92 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.755 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.757 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for db5b187e-9b4b-4bcf-a142-465cac63f18a / tape70422a1-99 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.758 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b75bd101-e194-4164-b8a4-715343380054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.753011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39c354e2-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'fb262ace1782286c21ab9b234167632ecc4045a7327bdb6f8afee74453e1c286'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.753011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39c3c0bc-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': '4ff33eff433acf971747ea84d9b4c80f327b9b4d979031d85ea44d9e3a49dd0a'}]}, 'timestamp': '2026-02-17 17:36:43.758602', '_unique_id': 'eb53737c62674b39a6e688f65ecc57be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.759 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.760 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.760 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.760 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>]
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.785 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.bytes volume: 29235712 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.786 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.818 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.819 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d9b8faf-a745-4419-918e-02c378843216', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29235712, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.761115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39c7f7fe-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': 'e9875823c8a39a8d809cc54cfb72018c9aa4e3f1dbc258e90322865315e0ec6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.761115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39c8071c-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '2bd9b0e5ded761aa43c01b1d0e368b0a14f0b9e31e02eb767dae9cd31d9d836c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.761115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39cd0ce4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': 'b056e9644d137df0753a03d8162f2a021b9c1e06196e5576a1707ca0b4931e0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.761115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39cd175c-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': 'eb493c9b23918f7b89d890c5993c650da08afe8ab3a4a2155714d5266177118c'}]}, 'timestamp': '2026-02-17 17:36:43.819717', '_unique_id': '3cf97ce3a5314828b55888ca744b5987'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.820 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>]
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.821 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93045728-6c6d-4f1b-b87d-480ed00c627a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.821698', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39cd6dc4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'a0eadeaaa1ef7d87b9b1786fd5f291e7f11d5ac02e281d86978c71e50d6b31a6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.821698', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39cd7620-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': 'f01b83942988fea1209e06daf913319a2bb67c018331dc62153f49863ae5bc07'}]}, 'timestamp': '2026-02-17 17:36:43.822155', '_unique_id': '25946659d55a4eec86151917780d1f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.822 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.823 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.823 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.incoming.bytes volume: 1436 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.823 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2767ca5c-c821-4e1a-9938-dcfceae1bb86', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1436, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.823388', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39cdb162-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': '3e9c315722d7edac373f16a173a6676722e505f8b5aa8105f118157be10ce383'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.823388', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39cdbcde-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': 'f22d2d19f3fce7f0146ea8b20f68560f44fb2788a1357f06a340fba89d122cef'}]}, 'timestamp': '2026-02-17 17:36:43.823990', '_unique_id': 'e74c6fccad764eda9e1c9f16c55a0732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.824 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.825 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.825 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1164cb3-89f8-4e7b-aeaa-1c256205a7a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.825430', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39ce00ae-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': '1e6575d2b3b6d89ba34d69433e4df00e653fcec78e13230cba766284a42587c8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.825430', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39ce0bd0-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': 'e62a2ba60319498ae73d6098eb7eeddd5e5329bd83f7fae612e14e93add0a26e'}]}, 'timestamp': '2026-02-17 17:36:43.826009', '_unique_id': '5c6b722a885143e69ac23875294d0fcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.826 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.828 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.latency volume: 561538920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.828 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.latency volume: 48318383 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.829 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.latency volume: 453498943 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.829 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.latency volume: 2190071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f03743-310e-48d4-bc02-838eaf6b7fb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 561538920, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.828669', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39ce7e6c-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '8cc0908fec42547b8f42721a908e175dce98e6c1847f534d79a51bc601911ff0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 48318383, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.828669', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39ce8646-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '107641a42137b1799f630006a942c3c3e5f45162a9f5237c680e41ae8d13c495'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 453498943, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.828669', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39ce8ea2-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': 'cc82b0fe2527835dd6e812c9d1bd1a74cf1e38fafbdc264da866d8fecff100cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2190071, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.828669', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39ce95a0-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '24e31f4b5fe0430f5bf674e8d8c7c166bfcde8fb98774dcde140c474edaad5a9'}]}, 'timestamp': '2026-02-17 17:36:43.829495', '_unique_id': '84c7c657d7b04bfdade27b775cb53056'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.831 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.831 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>]
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.831 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.831 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c29fa6f0-3617-403c-b836-ba3ea77fbd36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.831461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39ceec9e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': '8698f491d7063caf17ed69b9e41120add07383858f9394bce3325b85eee48b90'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.831461', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39cef7f2-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': 'a317fcf761cc59aa2e1abe2c06af025b35f71d7281be1b099d4cde0c75166cb0'}]}, 'timestamp': '2026-02-17 17:36:43.832046', '_unique_id': 'f46172f470ae4f9282dd5867952fb384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.832 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.833 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.bytes volume: 72929280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.833 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.833 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.833 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9863f57b-3652-4411-a607-b121a18aa495', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72929280, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.833077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39cf29c0-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '6058b63cb32ca325daf1696b316310299644d8a2b1fd6444661def8b50819de2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.833077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39cf3410-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '4271b733836e34b21c33a9a75f71bcd9583cf93c2aa3bce5c95c64ea2b9f73ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.833077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39cf3e60-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '8f7579aa53145ecc1b9dcd3f4b771e40b13ff208df3306eeb7dfa4b6cb5be9e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.833077', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39cf489c-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '8debe82df21c2682a5cb3a235164ec6ec350f4f6fa729e1fbab11fd44c43dccb'}]}, 'timestamp': '2026-02-17 17:36:43.834132', '_unique_id': 'c3b416445fb744d99ffe3c0b80818623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.834 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.835 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.835 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.requests volume: 1056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.835 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.835 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.835 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06a478f9-d31f-4f5d-86dc-a1867e7575ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1056, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.835355', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39cf828a-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': 'b47ba92544a3a1c538376084dca80e72b2e37709714650ab74786e2c892a0309'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': 
None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.835355', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39cf89a6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '13663facafcdfb70ddef3ae428c3d88bcf1d47b24e805b9838c3deb25e2e4942'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.835355', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39cf9086-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '5ea6c365593c8808d91410e97ae3fefb3f6c199e15ba39e6226f054656c693d8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.835355', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39cf9748-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '27c10eb2083fd787c3950876502e6a8e3a1725eefba0e4b5902b50970ab13de1'}]}, 'timestamp': '2026-02-17 17:36:43.836093', '_unique_id': '7ba00c316b85405fa00737766eaccf07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.836 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f303985e-b3ba-42d4-acd6-abdc5bfc3db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.837151', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39cfc8da-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': '954dfdbe6561e1bced76c1396d1a967ff6af16096bffcda7a33dd1281da5a6f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.837151', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39cfd096-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': '0d31f4a26caf2981885d1df4b4b5f4d1984e409d9cbf7d9ebc0d65e08ce0bd8f'}]}, 'timestamp': '2026-02-17 17:36:43.837555', '_unique_id': 'e1dc24141f444ea099e9f8c5e8fc0bdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.837 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.838 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.838 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.838 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88dda276-298e-4c0f-bf95-65348d7d7b11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.838525', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39cffe5e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'ce8cd5c24101f9a1413c250a9fedfcfcb1c93c1e1113520d72a6ca9679c0a464'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.838525', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39d00660-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': '0dca5d0710deb1e1577b96c6d1543b0a53d0aa3191c430cd2cb74719b3404faf'}]}, 'timestamp': '2026-02-17 17:36:43.838931', '_unique_id': 'aa223f1dc30149ea804986b937dfb482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.839 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-583735795>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1141949017>]
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.840 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e301df5e-71eb-4c69-9af1-9d8500337ea0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.840597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39d05160-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'd6f31e5e475302dca917e480b4328ecdd86c69447e5bb77187bc417e633cf86c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.840597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39d05caa-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': '53fe4c07b485932230268ee153be9244a4690ed34acfd7c405c9b7715043e55c'}]}, 'timestamp': '2026-02-17 17:36:43.841208', '_unique_id': 'a038b45f1f97473f84a4af81b96ee1c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.841 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.842 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.842 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.842 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '064f2431-6572-4ebf-8ce7-3772dfab7b68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.842529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39d09ae4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'afbd45df9a5593279dcdbd834aaf7fae7d738782d31448e945bb4c41f0758cf8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.842529', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39d0a2aa-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': 'bee728375125a366f0b6de62319817341675e4dacd153c4fc15932c4d4807567'}]}, 'timestamp': '2026-02-17 17:36:43.842932', '_unique_id': '0f27f163c5b9473b8b81cd4be377ebf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.855 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.855 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.867 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.867 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '598bff1f-dea7-4c66-8449-60420fd3f4ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.844067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d29240-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': '563c6753e9a89ec6758c81605bae751141bf2be06264585ee4a63443fe067529'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.844067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d2a5b4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': '1662b7e67ac02de8e1a7a476811a0903bc7946548c27db9e3d431e06597942fd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.844067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d473c6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '5f0af732ae1d2ce65d96b046c03a031b4519aada6b14056c91af7c5b1597b280'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.844067', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d47dc6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '7fb96f5b6faa9339ff702d357089bce4b02275022c322d99d2920825b19c3f2d'}]}, 'timestamp': '2026-02-17 17:36:43.868217', '_unique_id': '7c31c075a6c649588dcc7716db8fad43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.869 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/cpu volume: 10660000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/cpu volume: 7870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b129a59-31e9-4526-bf0a-4454b122de42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10660000000, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'timestamp': '2026-02-17T17:36:43.869889', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '39d4c83a-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.098813504, 'message_signature': '50375d5df6f0a6049fa2ff353a66257b47c12f49a30c02628756176d3a9a5a5a'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7870000000, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'timestamp': '2026-02-17T17:36:43.869889', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '39d4d1d6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.114637402, 'message_signature': 'cda2856593a757e8f461172431c7302c819addc3e1c1ae96d3012a84cc5024ec'}]}, 'timestamp': '2026-02-17 17:36:43.870370', '_unique_id': '99c009d49bdd43399110991d2ef8ca5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.870 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.871 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.871 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.871 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.872 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.872 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88a99e12-02ca-44f5-9620-1a8e20b139c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.871737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d51010-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': 'e31b745116445e832ad2a1c6c14db321c1a92dd32b68313f314da338d8354622'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.871737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d517cc-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': '53ff56f2c67f9c3b7af33f2ecc93797dcf7db28b01188e2a30620f51c8568956'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.871737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d522e4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '858f344bcca27004c471a96fdf3577b2b44c279455f305af7ac0bb7e1db3be68'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.871737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d52c4e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '5ea214b0ac050525169dcdae4adebaf3ac780ea7f06d9a4a980fc7f48c844fd4'}]}, 'timestamp': '2026-02-17 17:36:43.872699', '_unique_id': 'e804b0efacd8459b8b7ece3f820975d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.873 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.874 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.874 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.874 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.874 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.874 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93f76707-76a3-48a9-b75b-5670359e099b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.874159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d56ede-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': 'cdcd891135fbf03682eefb19ec86871d5abe739d206c350122f4b34e3dfeca4c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 
'2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.874159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d5774e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.209099575, 'message_signature': '31aea347094cec0ddc81f19993b306b194a7968e57dda986f797831261276dd3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.874159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 
'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d581c6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '84536e9a7f6369f215bbf42115d91940c0581f8bef02464dd1d9f4a5fe932372'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.874159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d58be4-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.221204795, 'message_signature': '9b986cd9d611f817c25b8fa1bfe2d3c71659a37e3d1e8d2c311a3e6a9143ab54'}]}, 'timestamp': '2026-02-17 17:36:43.875168', '_unique_id': 'f521996f2e564dbfada23a08cab11d33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.875 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.876 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.876 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.876 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69569ce5-150d-48c0-8826-a38e0d456897', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000b-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-tap2c569f82-92', 'timestamp': '2026-02-17T17:36:43.876347', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'tap2c569f82-92', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:00:92:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c569f82-92'}, 'message_id': '39d5c47e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.118025103, 'message_signature': 'e9349095f50e8656c02dc1c911952e3c9b458b8a47182a84c703d7459c6c2df6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'instance-0000000c-db5b187e-9b4b-4bcf-a142-465cac63f18a-tape70422a1-99', 'timestamp': '2026-02-17T17:36:43.876347', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'tape70422a1-99', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:29:90:91', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape70422a1-99'}, 'message_id': '39d5ce2e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.120992013, 'message_signature': '7607193f1cecd5ddb647b64c99b53a72e444d565538d3d1711d263d0a53e29b6'}]}, 'timestamp': '2026-02-17 17:36:43.876823', '_unique_id': '95fe47d4e155487aa8fe5a53302da0d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.877 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.878 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.878 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.878 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.878 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.878 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6823733-4ca3-4302-afcc-c0c5ec65621d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.878115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d6098e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '60a0ded8d2eff1bae0feb58c0ab917b9ff1856d3740143041e212eea12aa8f92'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.878115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d6142e-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '333b4bf42b693f490e8b62fcae15fe5dfa93c7fe7419d8998bcb2dc064829c5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.878115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d61e74-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '6cfd5cf195b60c9d6077d2c8d221989adc389883a527422bcdfee9075033c2dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.878115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d62856-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': 'be648c44667a249100b26d0aeba06bbbb4590f8856406aff0594b976a4f654d2'}]}, 'timestamp': '2026-02-17 17:36:43.879171', '_unique_id': '45ea9190d32d4cfeba4975494fb01540'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.879 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.881 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.881 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.latency volume: 1937359429 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.882 12 DEBUG ceilometer.compute.pollsters [-] 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.882 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.882 12 DEBUG ceilometer.compute.pollsters [-] db5b187e-9b4b-4bcf-a142-465cac63f18a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f426a2d-2242-4b7e-a117-1b3c840a81fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1937359429, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-vda', 'timestamp': '2026-02-17T17:36:43.881838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d69aca-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '1b358863567f510558bd2fcaebebe74551d59096860b107a5a9bd2174baff60f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 
'resource_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-sda', 'timestamp': '2026-02-17T17:36:43.881838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-583735795', 'name': 'instance-0000000b', 'instance_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d6a614-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.126127636, 'message_signature': '9e01c8323ba4ae4466a97a0e1d2363876cccf6a3cec9521150a15a3ddeac9888'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-vda', 'timestamp': '2026-02-17T17:36:43.881838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '39d6adc6-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': '1afa223031d619c3cf88c5a246c44f7821525a81f913d98716feb16a563c8451'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_name': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_name': None, 'resource_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a-sda', 'timestamp': '2026-02-17T17:36:43.881838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1141949017', 'name': 'instance-0000000c', 'instance_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'instance_type': 'm1.nano', 'host': '657d91c4427e0149c4ac13cd9148127994aadf144a41b082a04b8d28', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '5ebd41ec-8360-4181-bce3-0c0dc586cdb2', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}, 'image_ref': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '39d6b7b2-0c27-11f1-ab0d-fa163e76883c', 'monotonic_time': 3559.151564783, 'message_signature': 'b966250e3b55b0ae1c0caeadd22e91f64d87f7fa392ce1c2b564cd18a7503b48'}]}, 'timestamp': '2026-02-17 17:36:43.882817', '_unique_id': '2b62fe1329754695aa469c4ca9a07bfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     yield
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 17 17:36:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:36:43.883 12 ERROR oslo_messaging.notify.messaging 
Feb 17 17:36:43 compute-0 rsyslogd[1015]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 17 17:36:44 compute-0 nova_compute[186479]: 2026-02-17 17:36:44.588 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:44 compute-0 podman[219933]: 2026-02-17 17:36:44.720890456 +0000 UTC m=+0.063062276 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 17 17:36:45 compute-0 nova_compute[186479]: 2026-02-17 17:36:45.181 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:47 compute-0 ovn_controller[96568]: 2026-02-17T17:36:47Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:29:90:91 10.100.0.4
Feb 17 17:36:47 compute-0 ovn_controller[96568]: 2026-02-17T17:36:47Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:90:91 10.100.0.4
Feb 17 17:36:49 compute-0 nova_compute[186479]: 2026-02-17 17:36:49.592 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:49 compute-0 podman[219977]: 2026-02-17 17:36:49.710125184 +0000 UTC m=+0.050156068 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:36:50 compute-0 nova_compute[186479]: 2026-02-17 17:36:50.184 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:53 compute-0 podman[220001]: 2026-02-17 17:36:53.740878095 +0000 UTC m=+0.086219208 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1770267347, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Feb 17 17:36:54 compute-0 nova_compute[186479]: 2026-02-17 17:36:54.596 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:55 compute-0 nova_compute[186479]: 2026-02-17 17:36:55.187 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:55 compute-0 nova_compute[186479]: 2026-02-17 17:36:55.507 186483 INFO nova.compute.manager [None req-82d7009d-0b61-4e66-a0d7-c23cd59b1200 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Get console output
Feb 17 17:36:55 compute-0 nova_compute[186479]: 2026-02-17 17:36:55.513 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.597 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:56.598 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:36:56 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:36:56.600 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.739 186483 DEBUG nova.compute.manager [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.739 186483 DEBUG nova.compute.manager [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing instance network info cache due to event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.740 186483 DEBUG oslo_concurrency.lockutils [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.741 186483 DEBUG oslo_concurrency.lockutils [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.742 186483 DEBUG nova.network.neutron [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.776 186483 DEBUG nova.compute.manager [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.776 186483 DEBUG oslo_concurrency.lockutils [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.777 186483 DEBUG oslo_concurrency.lockutils [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.777 186483 DEBUG oslo_concurrency.lockutils [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.777 186483 DEBUG nova.compute.manager [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:36:56 compute-0 nova_compute[186479]: 2026-02-17 17:36:56.778 186483 WARNING nova.compute.manager [req-abb049dc-51c5-401e-851c-f054129089a3 req-b87de4aa-b7d8-4fd1-bad1-48dcb0311bd9 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state None.
Feb 17 17:36:57 compute-0 nova_compute[186479]: 2026-02-17 17:36:57.772 186483 INFO nova.compute.manager [None req-709aaabe-74e5-49ec-ae7c-0ea924117551 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Get console output
Feb 17 17:36:57 compute-0 nova_compute[186479]: 2026-02-17 17:36:57.778 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.642 186483 DEBUG nova.network.neutron [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updated VIF entry in instance network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.643 186483 DEBUG nova.network.neutron [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.665 186483 DEBUG oslo_concurrency.lockutils [req-6b333a88-7fc9-4db7-82aa-1c11de42675f req-a6c93fa6-85c5-4c03-a2d6-cc287b654465 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.855 186483 DEBUG nova.compute.manager [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.855 186483 DEBUG oslo_concurrency.lockutils [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.855 186483 DEBUG oslo_concurrency.lockutils [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.856 186483 DEBUG oslo_concurrency.lockutils [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.856 186483 DEBUG nova.compute.manager [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:36:58 compute-0 nova_compute[186479]: 2026-02-17 17:36:58.856 186483 WARNING nova.compute.manager [req-90d4ed78-928f-4f62-b3bb-bba3f535a068 req-96656357-d86d-46ac-adc5-918a96412c26 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state None.
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.600 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.876 186483 DEBUG nova.compute.manager [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.877 186483 DEBUG nova.compute.manager [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing instance network info cache due to event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.877 186483 DEBUG oslo_concurrency.lockutils [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.878 186483 DEBUG oslo_concurrency.lockutils [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:36:59 compute-0 nova_compute[186479]: 2026-02-17 17:36:59.878 186483 DEBUG nova.network.neutron [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.044 186483 INFO nova.compute.manager [None req-aa5d5541-f546-416f-a71d-c289c6b24e81 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Get console output
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.049 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.189 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.932 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.933 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.934 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.934 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.935 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.935 186483 WARNING nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state None.
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.936 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.936 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.937 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.937 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.938 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.938 186483 WARNING nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state None.
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.939 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.939 186483 DEBUG nova.compute.manager [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing instance network info cache due to event network-changed-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.940 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.940 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.941 186483 DEBUG nova.network.neutron [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Refreshing network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.976 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.976 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.976 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.977 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.977 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.978 186483 INFO nova.compute.manager [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Terminating instance
Feb 17 17:37:00 compute-0 nova_compute[186479]: 2026-02-17 17:37:00.979 186483 DEBUG nova.compute.manager [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:37:01 compute-0 kernel: tape70422a1-99 (unregistering): left promiscuous mode
Feb 17 17:37:01 compute-0 NetworkManager[56323]: <info>  [1771349821.0078] device (tape70422a1-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:37:01 compute-0 ovn_controller[96568]: 2026-02-17T17:37:01Z|00155|binding|INFO|Releasing lport e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 from this chassis (sb_readonly=0)
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.011 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 ovn_controller[96568]: 2026-02-17T17:37:01Z|00156|binding|INFO|Setting lport e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 down in Southbound
Feb 17 17:37:01 compute-0 ovn_controller[96568]: 2026-02-17T17:37:01Z|00157|binding|INFO|Removing iface tape70422a1-99 ovn-installed in OVS
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.019 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:90:91 10.100.0.4'], port_security=['fa:16:3e:29:90:91 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'db5b187e-9b4b-4bcf-a142-465cac63f18a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-641e4c07-2901-48bf-a652-443b9ce7f994', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7007f5af-2625-4331-8d6a-75bbeb10d462', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9416c76a-6d88-4459-97f8-aa5f83d3ca8b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.022 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.023 105898 INFO neutron.agent.ovn.metadata.agent [-] Port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 in datapath 641e4c07-2901-48bf-a652-443b9ce7f994 unbound from our chassis
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.025 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 641e4c07-2901-48bf-a652-443b9ce7f994
Feb 17 17:37:01 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Feb 17 17:37:01 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 12.415s CPU time.
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.041 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[1754b8be-690a-4168-a402-61b8f07e9b05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 systemd-machined[155877]: Machine qemu-12-instance-0000000c terminated.
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.072 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[704bb877-4200-437e-a0f4-2a03476a3a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.081 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[718b775f-b251-428f-8354-eb20a9a5b252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 podman[220026]: 2026-02-17 17:37:01.102883975 +0000 UTC m=+0.068271532 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 17 17:37:01 compute-0 podman[220024]: 2026-02-17 17:37:01.102884165 +0000 UTC m=+0.067508714 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.103 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[cf71b930-c161-4bdc-a3bb-e5248fd99969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.115 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb528f9-6ea3-494b-814e-98f2af99c76c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap641e4c07-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:26:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353000, 'reachable_time': 20104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220075, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.127 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[55235b4a-e4b2-49a0-a5af-1e9b8b5ac408]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap641e4c07-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353008, 'tstamp': 353008}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220076, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap641e4c07-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353010, 'tstamp': 353010}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220076, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.128 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap641e4c07-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.129 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.134 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap641e4c07-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.134 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.135 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap641e4c07-20, col_values=(('external_ids', {'iface-id': 'c7627b50-72b7-4f6f-a9ad-2ceb9b8b24e3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:01 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:01.135 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.233 186483 INFO nova.virt.libvirt.driver [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Instance destroyed successfully.
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.233 186483 DEBUG nova.objects.instance [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid db5b187e-9b4b-4bcf-a142-465cac63f18a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.250 186483 DEBUG nova.virt.libvirt.vif [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:36:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1141949017',display_name='tempest-TestNetworkBasicOps-server-1141949017',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1141949017',id=12,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMY9kFaPF0ad6/uB/CqSdqudglFsmVzh3fOsOsHzCokheVVxikzrTKMeUfSPm+xkyFvcPpj3vWQpfoC3CKEz3KEMBROnXSpQ3X6qvEUlwaD4nCaIZGb/XIZnqeudDtBPCA==',key_name='tempest-TestNetworkBasicOps-1963528792',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-nzibojz0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:36:35Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=db5b187e-9b4b-4bcf-a142-465cac63f18a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.250 186483 DEBUG nova.network.os_vif_util [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.251 186483 DEBUG nova.network.os_vif_util [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.251 186483 DEBUG os_vif [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.252 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.252 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape70422a1-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.253 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.255 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.257 186483 INFO os_vif [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:90:91,bridge_name='br-int',has_traffic_filtering=True,id=e70422a1-9986-4f6b-b8a3-4c7f3a7d7710,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape70422a1-99')
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.258 186483 INFO nova.virt.libvirt.driver [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Deleting instance files /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a_del
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.258 186483 INFO nova.virt.libvirt.driver [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Deletion of /var/lib/nova/instances/db5b187e-9b4b-4bcf-a142-465cac63f18a_del complete
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.306 186483 INFO nova.compute.manager [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.308 186483 DEBUG oslo.service.loopingcall [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.308 186483 DEBUG nova.compute.manager [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.308 186483 DEBUG nova.network.neutron [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.953 186483 DEBUG nova.network.neutron [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updated VIF entry in instance network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.954 186483 DEBUG nova.network.neutron [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:01 compute-0 nova_compute[186479]: 2026-02-17 17:37:01.969 186483 DEBUG oslo_concurrency.lockutils [req-5a4768f3-01aa-4727-bb89-2acea5d26623 req-b0d272c1-74cc-41f0-a93d-1815fe8efecc 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.400 186483 DEBUG nova.network.neutron [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.427 186483 INFO nova.compute.manager [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Took 1.12 seconds to deallocate network for instance.
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.523 186483 DEBUG nova.compute.manager [req-102ccac7-3d9c-4799-81b2-0089431fa8ae req-8461e73e-d15a-4705-885a-de4178ddc752 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-vif-deleted-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.539 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.540 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.554 186483 DEBUG nova.network.neutron [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updated VIF entry in instance network info cache for port e70422a1-9986-4f6b-b8a3-4c7f3a7d7710. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.554 186483 DEBUG nova.network.neutron [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Updating instance_info_cache with network_info: [{"id": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "address": "fa:16:3e:29:90:91", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape70422a1-99", "ovs_interfaceid": "e70422a1-9986-4f6b-b8a3-4c7f3a7d7710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.581 186483 DEBUG oslo_concurrency.lockutils [req-d4bc53ed-14c7-40a5-80b6-c04c029eb651 req-e2338d2a-f5b5-4eb9-bc5c-0831fce8a6a3 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-db5b187e-9b4b-4bcf-a142-465cac63f18a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.608 186483 DEBUG nova.compute.provider_tree [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.621 186483 DEBUG nova.scheduler.client.report [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.640 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.666 186483 INFO nova.scheduler.client.report [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance db5b187e-9b4b-4bcf-a142-465cac63f18a
Feb 17 17:37:02 compute-0 nova_compute[186479]: 2026-02-17 17:37:02.718 186483 DEBUG oslo_concurrency.lockutils [None req-f032c690-d467-4b56-a686-a04902758ef4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.098 186483 DEBUG nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-vif-unplugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.099 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.099 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.099 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.099 186483 DEBUG nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] No waiting events found dispatching network-vif-unplugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.099 186483 WARNING nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received unexpected event network-vif-unplugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 for instance with vm_state deleted and task_state None.
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 DEBUG nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 DEBUG oslo_concurrency.lockutils [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "db5b187e-9b4b-4bcf-a142-465cac63f18a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 DEBUG nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] No waiting events found dispatching network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.100 186483 WARNING nova.compute.manager [req-e7af024a-e6e0-4106-b348-05b985dfcda1 req-c2b40048-26e1-4b27-97ed-657e9ccf01db 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Received unexpected event network-vif-plugged-e70422a1-9986-4f6b-b8a3-4c7f3a7d7710 for instance with vm_state deleted and task_state None.
Feb 17 17:37:03 compute-0 nova_compute[186479]: 2026-02-17 17:37:03.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.568 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.569 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.569 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.570 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.570 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.571 186483 INFO nova.compute.manager [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Terminating instance
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.572 186483 DEBUG nova.compute.manager [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:37:04 compute-0 kernel: tap2c569f82-92 (unregistering): left promiscuous mode
Feb 17 17:37:04 compute-0 NetworkManager[56323]: <info>  [1771349824.5989] device (tap2c569f82-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:37:04 compute-0 ovn_controller[96568]: 2026-02-17T17:37:04Z|00158|binding|INFO|Releasing lport 2c569f82-9221-46a0-b481-e5d95c02ed5c from this chassis (sb_readonly=0)
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.605 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 ovn_controller[96568]: 2026-02-17T17:37:04Z|00159|binding|INFO|Setting lport 2c569f82-9221-46a0-b481-e5d95c02ed5c down in Southbound
Feb 17 17:37:04 compute-0 ovn_controller[96568]: 2026-02-17T17:37:04Z|00160|binding|INFO|Removing iface tap2c569f82-92 ovn-installed in OVS
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.612 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.615 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:04.617 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:92:e0 10.100.0.11'], port_security=['fa:16:3e:00:92:e0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-641e4c07-2901-48bf-a652-443b9ce7f994', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'fbda417a-4d86-41ac-b08e-f696e30a840a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9416c76a-6d88-4459-97f8-aa5f83d3ca8b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=2c569f82-9221-46a0-b481-e5d95c02ed5c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:37:04 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:04.618 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 2c569f82-9221-46a0-b481-e5d95c02ed5c in datapath 641e4c07-2901-48bf-a652-443b9ce7f994 unbound from our chassis
Feb 17 17:37:04 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:04.619 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 641e4c07-2901-48bf-a652-443b9ce7f994, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:37:04 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:04.620 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f0fc00-fb76-423a-932a-4bd15c9385bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:04 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:04.621 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994 namespace which is not needed anymore
Feb 17 17:37:04 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 17 17:37:04 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 13.576s CPU time.
Feb 17 17:37:04 compute-0 systemd-machined[155877]: Machine qemu-11-instance-0000000b terminated.
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.788 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.791 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.816 186483 INFO nova.virt.libvirt.driver [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Instance destroyed successfully.
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.817 186483 DEBUG nova.objects.instance [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:37:04 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [NOTICE]   (219734) : haproxy version is 2.8.14-c23fe91
Feb 17 17:37:04 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [NOTICE]   (219734) : path to executable is /usr/sbin/haproxy
Feb 17 17:37:04 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [WARNING]  (219734) : Exiting Master process...
Feb 17 17:37:04 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [ALERT]    (219734) : Current worker (219736) exited with code 143 (Terminated)
Feb 17 17:37:04 compute-0 neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994[219730]: [WARNING]  (219734) : All workers exited. Exiting... (0)
Feb 17 17:37:04 compute-0 systemd[1]: libpod-3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08.scope: Deactivated successfully.
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.828 186483 DEBUG nova.virt.libvirt.vif [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-583735795',display_name='tempest-TestNetworkBasicOps-server-583735795',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-583735795',id=11,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDZdxdh7Y8sCHr0ByIyzH8BM0AjjH4br5ABCZRoTjXvr3Yn+jp5fn1rx4ummIhhsikquoJBFsBIXS6y0HjczWSj9RZWPZhWjtz7yoGSiC0hUryculZGDU9ynXoq5gn5u4g==',key_name='tempest-TestNetworkBasicOps-1832315980',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:36:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-120jnykj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:36:15Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.829 186483 DEBUG nova.network.os_vif_util [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.829 186483 DEBUG nova.network.os_vif_util [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.829 186483 DEBUG os_vif [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.830 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.831 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c569f82-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.832 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.833 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.835 186483 INFO os_vif [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:92:e0,bridge_name='br-int',has_traffic_filtering=True,id=2c569f82-9221-46a0-b481-e5d95c02ed5c,network=Network(641e4c07-2901-48bf-a652-443b9ce7f994),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c569f82-92')
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.835 186483 INFO nova.virt.libvirt.driver [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Deleting instance files /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf_del
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.835 186483 INFO nova.virt.libvirt.driver [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Deletion of /var/lib/nova/instances/2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf_del complete
Feb 17 17:37:04 compute-0 podman[220121]: 2026-02-17 17:37:04.836157744 +0000 UTC m=+0.139158424 container died 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.880 186483 INFO nova.compute.manager [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Took 0.31 seconds to destroy the instance on the hypervisor.
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.880 186483 DEBUG oslo.service.loopingcall [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.881 186483 DEBUG nova.compute.manager [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:37:04 compute-0 nova_compute[186479]: 2026-02-17 17:37:04.881 186483 DEBUG nova.network.neutron [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:37:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08-userdata-shm.mount: Deactivated successfully.
Feb 17 17:37:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-b006fb4561ad72bc8c56752cdc428a29754f8d013dcefd51e2976c4f04f68945-merged.mount: Deactivated successfully.
Feb 17 17:37:04 compute-0 podman[220121]: 2026-02-17 17:37:04.951466942 +0000 UTC m=+0.254467622 container cleanup 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 17 17:37:04 compute-0 systemd[1]: libpod-conmon-3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08.scope: Deactivated successfully.
Feb 17 17:37:05 compute-0 podman[220167]: 2026-02-17 17:37:05.049550914 +0000 UTC m=+0.081144093 container remove 3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.053 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[6297a9da-b4c1-49e2-ad89-2535e58daf28]: (4, ('Tue Feb 17 05:37:04 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994 (3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08)\n3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08\nTue Feb 17 05:37:04 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994 (3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08)\n3bee9a3f51c5e1bb520a18b226005a347978ea0b7e6749664069d40bd9993c08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.054 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0fe791-893c-4a52-947d-97320590454c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.055 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap641e4c07-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.057 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:05 compute-0 kernel: tap641e4c07-20: left promiscuous mode
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.060 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.063 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[88677707-5b09-4c56-a0b7-da7747af2844]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.078 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7a81a691-b40f-4987-a31a-385deedbfd71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.079 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[be55c243-4b44-4f68-8d05-602890eaa013]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.092 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[e7b1e952-041b-41b1-9282-8e5827c484a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352994, 'reachable_time': 20740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220182, 'error': None, 'target': 'ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.095 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-641e4c07-2901-48bf-a652-443b9ce7f994 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:37:05 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:05.095 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[41a18488-023e-4948-aee7-92d3be413256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d641e4c07\x2d2901\x2d48bf\x2da652\x2d443b9ce7f994.mount: Deactivated successfully.
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.192 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.202 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.203 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing instance network info cache due to event network-changed-2c569f82-9221-46a0-b481-e5d95c02ed5c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.203 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.204 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.204 186483 DEBUG nova.network.neutron [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Refreshing network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.638 186483 DEBUG nova.network.neutron [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.658 186483 INFO nova.compute.manager [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Took 0.78 seconds to deallocate network for instance.
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.714 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.714 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.769 186483 DEBUG nova.compute.provider_tree [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.785 186483 DEBUG nova.scheduler.client.report [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.805 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.826 186483 INFO nova.scheduler.client.report [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf
Feb 17 17:37:05 compute-0 nova_compute[186479]: 2026-02-17 17:37:05.905 186483 DEBUG oslo_concurrency.lockutils [None req-971e1ccd-6a94-44b8-8bb4-c4ac9d5064d4 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.191 186483 DEBUG nova.network.neutron [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updated VIF entry in instance network info cache for port 2c569f82-9221-46a0-b481-e5d95c02ed5c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.192 186483 DEBUG nova.network.neutron [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Updating instance_info_cache with network_info: [{"id": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "address": "fa:16:3e:00:92:e0", "network": {"id": "641e4c07-2901-48bf-a652-443b9ce7f994", "bridge": "br-int", "label": "tempest-network-smoke--1014798980", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c569f82-92", "ovs_interfaceid": "2c569f82-9221-46a0-b481-e5d95c02ed5c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.207 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.208 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.208 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.209 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.209 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.210 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.210 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-unplugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.211 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.211 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.212 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.212 186483 DEBUG oslo_concurrency.lockutils [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.213 186483 DEBUG nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] No waiting events found dispatching network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.213 186483 WARNING nova.compute.manager [req-8a1cd1c0-2bb4-4ba0-875a-8b3f9e2a57aa req-74041972-7cc0-42f0-aac7-ef8a382d18f2 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received unexpected event network-vif-plugged-2c569f82-9221-46a0-b481-e5d95c02ed5c for instance with vm_state active and task_state deleting.
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.324 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.325 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.326 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.326 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.492 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.493 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5738MB free_disk=73.20691299438477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.493 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.494 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.544 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.544 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.564 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.576 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.597 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:37:06 compute-0 nova_compute[186479]: 2026-02-17 17:37:06.598 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:06 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:06.603 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:07 compute-0 nova_compute[186479]: 2026-02-17 17:37:07.282 186483 DEBUG nova.compute.manager [req-23212cf7-3cc8-4823-8d14-60c5b59deaa0 req-a7ebf40e-dacd-492c-8ce8-a625a0b28eb6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Received event network-vif-deleted-2c569f82-9221-46a0-b481-e5d95c02ed5c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:07 compute-0 podman[220185]: 2026-02-17 17:37:07.728250437 +0000 UTC m=+0.059006727 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:37:08 compute-0 nova_compute[186479]: 2026-02-17 17:37:08.549 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:08 compute-0 nova_compute[186479]: 2026-02-17 17:37:08.567 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:09 compute-0 nova_compute[186479]: 2026-02-17 17:37:09.832 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:10 compute-0 nova_compute[186479]: 2026-02-17 17:37:10.192 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:10.955 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:10.956 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:10.956 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:13 compute-0 nova_compute[186479]: 2026-02-17 17:37:13.598 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:37:13 compute-0 nova_compute[186479]: 2026-02-17 17:37:13.598 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:37:13 compute-0 nova_compute[186479]: 2026-02-17 17:37:13.599 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:37:13 compute-0 nova_compute[186479]: 2026-02-17 17:37:13.611 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:37:14 compute-0 nova_compute[186479]: 2026-02-17 17:37:14.837 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:15 compute-0 nova_compute[186479]: 2026-02-17 17:37:15.194 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:15 compute-0 podman[220210]: 2026-02-17 17:37:15.795524083 +0000 UTC m=+0.126105810 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 17 17:37:16 compute-0 nova_compute[186479]: 2026-02-17 17:37:16.232 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349821.230923, db5b187e-9b4b-4bcf-a142-465cac63f18a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:37:16 compute-0 nova_compute[186479]: 2026-02-17 17:37:16.232 186483 INFO nova.compute.manager [-] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] VM Stopped (Lifecycle Event)
Feb 17 17:37:16 compute-0 nova_compute[186479]: 2026-02-17 17:37:16.266 186483 DEBUG nova.compute.manager [None req-f0241b81-5945-41f3-acdb-52f1db0545fb - - - - - -] [instance: db5b187e-9b4b-4bcf-a142-465cac63f18a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:19 compute-0 nova_compute[186479]: 2026-02-17 17:37:19.816 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349824.8144808, 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:37:19 compute-0 nova_compute[186479]: 2026-02-17 17:37:19.816 186483 INFO nova.compute.manager [-] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] VM Stopped (Lifecycle Event)
Feb 17 17:37:19 compute-0 nova_compute[186479]: 2026-02-17 17:37:19.839 186483 DEBUG nova.compute.manager [None req-e22bcadb-2b03-42c7-b3bc-d92f27d212b4 - - - - - -] [instance: 2c6e94c6-cbb4-4f6c-ba00-19a15f2971bf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:19 compute-0 nova_compute[186479]: 2026-02-17 17:37:19.840 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:20 compute-0 nova_compute[186479]: 2026-02-17 17:37:20.196 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:20 compute-0 podman[220237]: 2026-02-17 17:37:20.715177707 +0000 UTC m=+0.055071433 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:37:24 compute-0 podman[220261]: 2026-02-17 17:37:24.743323726 +0000 UTC m=+0.079876272 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 17 17:37:24 compute-0 nova_compute[186479]: 2026-02-17 17:37:24.842 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:25 compute-0 nova_compute[186479]: 2026-02-17 17:37:25.198 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:29 compute-0 nova_compute[186479]: 2026-02-17 17:37:29.843 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:30 compute-0 nova_compute[186479]: 2026-02-17 17:37:30.228 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:31 compute-0 podman[220283]: 2026-02-17 17:37:31.703030233 +0000 UTC m=+0.044106919 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 17 17:37:31 compute-0 podman[220284]: 2026-02-17 17:37:31.715751661 +0000 UTC m=+0.052082262 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.652 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.653 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.674 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.760 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.760 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.768 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.769 186483 INFO nova.compute.claims [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Claim successful on node compute-0.ctlplane.example.com
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.937 186483 DEBUG nova.compute.provider_tree [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.966 186483 DEBUG nova.scheduler.client.report [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.982 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:32 compute-0 nova_compute[186479]: 2026-02-17 17:37:32.983 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.033 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.033 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.055 186483 INFO nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.074 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.319 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.321 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.322 186483 INFO nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Creating image(s)
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.324 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.324 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.325 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.350 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.400 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.401 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.402 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.426 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.443 186483 DEBUG nova.policy [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f041abe92134380b8de39091bce5989', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.476 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.477 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.505 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2,backing_fmt=raw /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.507 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "7b3debffc633fa6c7d3dca092b3f945ffeddd7e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.507 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.555 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/7b3debffc633fa6c7d3dca092b3f945ffeddd7e2 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.557 186483 DEBUG nova.virt.disk.api [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Checking if we can resize image /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.558 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.638 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.639 186483 DEBUG nova.virt.disk.api [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Cannot resize image /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.640 186483 DEBUG nova.objects.instance [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'migration_context' on Instance uuid 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.674 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.674 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Ensure instance console log exists: /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.675 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.676 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:33 compute-0 nova_compute[186479]: 2026-02-17 17:37:33.677 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.107 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Successfully created port: 1358eab8-69e5-4c5f-8d62-27e45e17fa8a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.838 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Successfully updated port: 1358eab8-69e5-4c5f-8d62-27e45e17fa8a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.846 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.857 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.857 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquired lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.857 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.920 186483 DEBUG nova.compute.manager [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.921 186483 DEBUG nova.compute.manager [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing instance network info cache due to event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.921 186483 DEBUG oslo_concurrency.lockutils [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:37:34 compute-0 nova_compute[186479]: 2026-02-17 17:37:34.996 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.230 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.747 186483 DEBUG nova.network.neutron [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updating instance_info_cache with network_info: [{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.778 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Releasing lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.779 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Instance network_info: |[{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.780 186483 DEBUG oslo_concurrency.lockutils [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.780 186483 DEBUG nova.network.neutron [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.785 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Start _get_guest_xml network_info=[{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_format': None, 'encryption_options': None, 'size': 0, 'encrypted': False, 'image_id': '4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.790 186483 WARNING nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.795 186483 DEBUG nova.virt.libvirt.host [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.796 186483 DEBUG nova.virt.libvirt.host [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.800 186483 DEBUG nova.virt.libvirt.host [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.800 186483 DEBUG nova.virt.libvirt.host [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.801 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.801 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-17T17:28:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='5ebd41ec-8360-4181-bce3-0c0dc586cdb2',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-17T17:28:07Z,direct_url=<?>,disk_format='qcow2',id=4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='02ed9754ecd847a6a89524591c01aa73',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-17T17:28:08Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.802 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.802 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.802 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.803 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.803 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.803 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.803 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.804 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.804 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.804 186483 DEBUG nova.virt.hardware [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.809 186483 DEBUG nova.virt.libvirt.vif [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1618356044',display_name='tempest-TestNetworkBasicOps-server-1618356044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1618356044',id=13,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAjQmULcqKhsDZFloeG+5/btusIEnBrPujxK/2cYvYRTFgh2kVKyTQB/92xag37zwEO0bktN7Cm+U8jisxmvTC7qfYDwo6uNuZQK1rDz5JKRWzOf8xpCtVioqdsTDnkjWw==',key_name='tempest-TestNetworkBasicOps-77023403',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-kx7ngm10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:37:33Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9fbffa09-d91f-4a5b-9cd4-db1243a19e88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.809 186483 DEBUG nova.network.os_vif_util [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.810 186483 DEBUG nova.network.os_vif_util [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.812 186483 DEBUG nova.objects.instance [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.828 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] End _get_guest_xml xml=<domain type="kvm">
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <uuid>9fbffa09-d91f-4a5b-9cd4-db1243a19e88</uuid>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <name>instance-0000000d</name>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <memory>131072</memory>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <vcpu>1</vcpu>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <metadata>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:name>tempest-TestNetworkBasicOps-server-1618356044</nova:name>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:creationTime>2026-02-17 17:37:35</nova:creationTime>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:flavor name="m1.nano">
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:memory>128</nova:memory>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:disk>1</nova:disk>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:swap>0</nova:swap>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:ephemeral>0</nova:ephemeral>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:vcpus>1</nova:vcpus>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       </nova:flavor>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:owner>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:user uuid="3f041abe92134380b8de39091bce5989">tempest-TestNetworkBasicOps-1366681220-project-member</nova:user>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:project uuid="b751dc9f26b74fe7b2bdea4718093b3c">tempest-TestNetworkBasicOps-1366681220</nova:project>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       </nova:owner>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:root type="image" uuid="4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <nova:ports>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         <nova:port uuid="1358eab8-69e5-4c5f-8d62-27e45e17fa8a">
Feb 17 17:37:35 compute-0 nova_compute[186479]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:         </nova:port>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       </nova:ports>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </nova:instance>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </metadata>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <sysinfo type="smbios">
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <system>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="manufacturer">RDO</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="product">OpenStack Compute</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="serial">9fbffa09-d91f-4a5b-9cd4-db1243a19e88</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="uuid">9fbffa09-d91f-4a5b-9cd4-db1243a19e88</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <entry name="family">Virtual Machine</entry>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </system>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </sysinfo>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <os>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <boot dev="hd"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <smbios mode="sysinfo"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </os>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <features>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <acpi/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <apic/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <vmcoreinfo/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </features>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <clock offset="utc">
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <timer name="pit" tickpolicy="delay"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <timer name="hpet" present="no"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </clock>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <cpu mode="host-model" match="exact">
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <topology sockets="1" cores="1" threads="1"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </cpu>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   <devices>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <disk type="file" device="disk">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <target dev="vda" bus="virtio"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <disk type="file" device="cdrom">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <driver name="qemu" type="raw" cache="none"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <source file="/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.config"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <target dev="sda" bus="sata"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </disk>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <interface type="ethernet">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <mac address="fa:16:3e:15:85:08"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <driver name="vhost" rx_queue_size="512"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <mtu size="1442"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <target dev="tap1358eab8-69"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </interface>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <serial type="pty">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <log file="/var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/console.log" append="off"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </serial>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <video>
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <model type="virtio"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </video>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <input type="tablet" bus="usb"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <rng model="virtio">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <backend model="random">/dev/urandom</backend>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </rng>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="pci" model="pcie-root-port"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <controller type="usb" index="0"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     <memballoon model="virtio">
Feb 17 17:37:35 compute-0 nova_compute[186479]:       <stats period="10"/>
Feb 17 17:37:35 compute-0 nova_compute[186479]:     </memballoon>
Feb 17 17:37:35 compute-0 nova_compute[186479]:   </devices>
Feb 17 17:37:35 compute-0 nova_compute[186479]: </domain>
Feb 17 17:37:35 compute-0 nova_compute[186479]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.829 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Preparing to wait for external event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.830 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.830 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.830 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.831 186483 DEBUG nova.virt.libvirt.vif [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-17T17:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1618356044',display_name='tempest-TestNetworkBasicOps-server-1618356044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1618356044',id=13,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAjQmULcqKhsDZFloeG+5/btusIEnBrPujxK/2cYvYRTFgh2kVKyTQB/92xag37zwEO0bktN7Cm+U8jisxmvTC7qfYDwo6uNuZQK1rDz5JKRWzOf8xpCtVioqdsTDnkjWw==',key_name='tempest-TestNetworkBasicOps-77023403',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-kx7ngm10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-17T17:37:33Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9fbffa09-d91f-4a5b-9cd4-db1243a19e88,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.832 186483 DEBUG nova.network.os_vif_util [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.832 186483 DEBUG nova.network.os_vif_util [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.833 186483 DEBUG os_vif [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.833 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.834 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.834 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.838 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.838 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1358eab8-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.838 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1358eab8-69, col_values=(('external_ids', {'iface-id': '1358eab8-69e5-4c5f-8d62-27e45e17fa8a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:85:08', 'vm-uuid': '9fbffa09-d91f-4a5b-9cd4-db1243a19e88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.840 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:35 compute-0 NetworkManager[56323]: <info>  [1771349855.8417] manager: (tap1358eab8-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.842 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.846 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.847 186483 INFO os_vif [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69')
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.891 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.892 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.892 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] No VIF found with MAC fa:16:3e:15:85:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 17 17:37:35 compute-0 nova_compute[186479]: 2026-02-17 17:37:35.893 186483 INFO nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Using config drive
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.455 186483 INFO nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Creating config drive at /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.config
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.463 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkh5zzp4o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.582 186483 DEBUG oslo_concurrency.processutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkh5zzp4o" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:37:36 compute-0 kernel: tap1358eab8-69: entered promiscuous mode
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.6414] manager: (tap1358eab8-69): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Feb 17 17:37:36 compute-0 ovn_controller[96568]: 2026-02-17T17:37:36Z|00161|binding|INFO|Claiming lport 1358eab8-69e5-4c5f-8d62-27e45e17fa8a for this chassis.
Feb 17 17:37:36 compute-0 ovn_controller[96568]: 2026-02-17T17:37:36Z|00162|binding|INFO|1358eab8-69e5-4c5f-8d62-27e45e17fa8a: Claiming fa:16:3e:15:85:08 10.100.0.11
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.643 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.650 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.662 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:85:08 10.100.0.11'], port_security=['fa:16:3e:15:85:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9fbffa09-d91f-4a5b-9cd4-db1243a19e88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c865e868-e4b3-46db-a4d6-df6c2e739109', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'afffdd06-546b-4c25-8b84-bc6108cdd488', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9e903a7-0c6a-47d2-ba74-58fea5aeb287, chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=1358eab8-69e5-4c5f-8d62-27e45e17fa8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.664 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a in datapath c865e868-e4b3-46db-a4d6-df6c2e739109 bound to our chassis
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.666 105898 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c865e868-e4b3-46db-a4d6-df6c2e739109
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.673 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 ovn_controller[96568]: 2026-02-17T17:37:36Z|00163|binding|INFO|Setting lport 1358eab8-69e5-4c5f-8d62-27e45e17fa8a ovn-installed in OVS
Feb 17 17:37:36 compute-0 ovn_controller[96568]: 2026-02-17T17:37:36Z|00164|binding|INFO|Setting lport 1358eab8-69e5-4c5f-8d62-27e45e17fa8a up in Southbound
Feb 17 17:37:36 compute-0 systemd-udevd[220356]: Network interface NamePolicy= disabled on kernel command line.
Feb 17 17:37:36 compute-0 systemd-machined[155877]: New machine qemu-13-instance-0000000d.
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.679 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.682 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0171e6-270b-4d37-b406-824d9c9d3c25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.685 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc865e868-e1 in ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.687 215208 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc865e868-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.687 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[f76d61e1-9c94-41cd-89f1-4aca16b3389a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.688 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7e722b57-fceb-43e5-a22a-2517bbf151a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.6973] device (tap1358eab8-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.6983] device (tap1358eab8-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 17 17:37:36 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.699 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[b920a07c-051a-4a1d-8322-59c93daecd21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.715 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[33efbbd1-4a54-463a-b51b-913a69740f19]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.737 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e3aa9b-fbca-4489-b4ba-272588969102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.7446] manager: (tapc865e868-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.743 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[6be23119-7f38-40a6-9a4c-ae54c2718427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.777 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[670c5bfc-73f9-4bc2-9b01-5fe675aacf74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.781 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[e645d924-f3b1-49fc-af61-384b9fde6cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.7995] device (tapc865e868-e0): carrier: link connected
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.804 215222 DEBUG oslo.privsep.daemon [-] privsep: reply[df02b9f9-284a-4ffd-a3a7-eba15f372e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.819 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[39dff51b-b3db-4b3f-a892-24d61d9d0d75]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc865e868-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:17:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361210, 'reachable_time': 37049, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220389, 'error': None, 'target': 'ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.828 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed244fb-a288-4813-b857-4d0e4da3fda8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee1:171b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 361210, 'tstamp': 361210}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220390, 'error': None, 'target': 'ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.838 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[307738ae-5122-4cbc-8831-fd3e85d8bc27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc865e868-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e1:17:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361210, 'reachable_time': 37049, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220391, 'error': None, 'target': 'ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.854 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[836ef4ee-273b-4e12-af5f-f031153af302]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.886 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[4280f1c0-bac7-450a-be47-228d9171b87a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.888 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc865e868-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.888 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.889 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc865e868-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:36 compute-0 NetworkManager[56323]: <info>  [1771349856.8916] manager: (tapc865e868-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Feb 17 17:37:36 compute-0 kernel: tapc865e868-e0: entered promiscuous mode
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.893 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc865e868-e0, col_values=(('external_ids', {'iface-id': '98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:37:36 compute-0 ovn_controller[96568]: 2026-02-17T17:37:36Z|00165|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.896 105898 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c865e868-e4b3-46db-a4d6-df6c2e739109.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c865e868-e4b3-46db-a4d6-df6c2e739109.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.898 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[93fc4f2c-313a-411b-ace2-7f8705887f01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.899 105898 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: global
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     log         /dev/log local0 debug
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     log-tag     haproxy-metadata-proxy-c865e868-e4b3-46db-a4d6-df6c2e739109
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     user        root
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     group       root
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     maxconn     1024
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     pidfile     /var/lib/neutron/external/pids/c865e868-e4b3-46db-a4d6-df6c2e739109.pid.haproxy
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     daemon
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: defaults
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     log global
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     mode http
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     option httplog
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     option dontlognull
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     option http-server-close
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     option forwardfor
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     retries                 3
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     timeout http-request    30s
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     timeout connect         30s
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     timeout client          32s
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     timeout server          32s
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     timeout http-keep-alive 30s
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: listen listener
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     bind 169.254.169.254:80
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     server metadata /var/lib/neutron/metadata_proxy
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:     http-request add-header X-OVN-Network-ID c865e868-e4b3-46db-a4d6-df6c2e739109
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.890 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.901 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:36 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:36.902 105898 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109', 'env', 'PROCESS_TAG=haproxy-c865e868-e4b3-46db-a4d6-df6c2e739109', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c865e868-e4b3-46db-a4d6-df6c2e739109.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.963 186483 DEBUG nova.compute.manager [req-b67aa129-e1bb-40d7-ae88-b6f04edb6d20 req-974cd244-1463-4397-8c2e-8417c16c3a3e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.964 186483 DEBUG oslo_concurrency.lockutils [req-b67aa129-e1bb-40d7-ae88-b6f04edb6d20 req-974cd244-1463-4397-8c2e-8417c16c3a3e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.965 186483 DEBUG oslo_concurrency.lockutils [req-b67aa129-e1bb-40d7-ae88-b6f04edb6d20 req-974cd244-1463-4397-8c2e-8417c16c3a3e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.965 186483 DEBUG oslo_concurrency.lockutils [req-b67aa129-e1bb-40d7-ae88-b6f04edb6d20 req-974cd244-1463-4397-8c2e-8417c16c3a3e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:36 compute-0 nova_compute[186479]: 2026-02-17 17:37:36.966 186483 DEBUG nova.compute.manager [req-b67aa129-e1bb-40d7-ae88-b6f04edb6d20 req-974cd244-1463-4397-8c2e-8417c16c3a3e 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Processing event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.071 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349857.0715525, 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.073 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] VM Started (Lifecycle Event)
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.076 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.087 186483 DEBUG nova.network.neutron [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updated VIF entry in instance network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.088 186483 DEBUG nova.network.neutron [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updating instance_info_cache with network_info: [{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.090 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.094 186483 INFO nova.virt.libvirt.driver [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Instance spawned successfully.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.094 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.130 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.132 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.134 186483 DEBUG oslo_concurrency.lockutils [req-3e6ef81d-7e36-4c96-8371-a566247f02b8 req-aea8d831-72f8-41c9-ace7-1a837e27387f 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.182 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.183 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349857.0717516, 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.184 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] VM Paused (Lifecycle Event)
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.192 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.192 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.193 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.194 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.195 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.196 186483 DEBUG nova.virt.libvirt.driver [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.205 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.209 186483 DEBUG nova.virt.driver [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] Emitting event <LifecycleEvent: 1771349857.0785751, 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.210 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] VM Resumed (Lifecycle Event)
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.254 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:37 compute-0 podman[220427]: 2026-02-17 17:37:37.257316012 +0000 UTC m=+0.054148010 container create 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.263 186483 DEBUG nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 17 17:37:37 compute-0 systemd[1]: Started libpod-conmon-8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28.scope.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.315 186483 INFO nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Took 4.00 seconds to spawn the instance on the hypervisor.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.316 186483 DEBUG nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.319 186483 INFO nova.compute.manager [None req-9de5855c-4252-4cfc-8686-158d38d6285d - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 17 17:37:37 compute-0 systemd[1]: Started libcrun container.
Feb 17 17:37:37 compute-0 podman[220427]: 2026-02-17 17:37:37.227410838 +0000 UTC m=+0.024242916 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 17 17:37:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bd0146b74a92bb7d878ed8c8bc7156eb17a26874a3abed73f92346c34d169c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 17 17:37:37 compute-0 podman[220427]: 2026-02-17 17:37:37.344886129 +0000 UTC m=+0.141718217 container init 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 17 17:37:37 compute-0 podman[220427]: 2026-02-17 17:37:37.350950405 +0000 UTC m=+0.147782433 container start 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:37:37 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [NOTICE]   (220446) : New worker (220448) forked
Feb 17 17:37:37 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [NOTICE]   (220446) : Loading success.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.391 186483 INFO nova.compute.manager [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Took 4.67 seconds to build instance.
Feb 17 17:37:37 compute-0 nova_compute[186479]: 2026-02-17 17:37:37.406 186483 DEBUG oslo_concurrency.lockutils [None req-24f4d99c-a7c8-47b2-8878-9bf095890269 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:38 compute-0 podman[220457]: 2026-02-17 17:37:38.732767054 +0000 UTC m=+0.071051008 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.064 186483 DEBUG nova.compute.manager [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.064 186483 DEBUG oslo_concurrency.lockutils [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.064 186483 DEBUG oslo_concurrency.lockutils [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.065 186483 DEBUG oslo_concurrency.lockutils [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.065 186483 DEBUG nova.compute.manager [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] No waiting events found dispatching network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:37:39 compute-0 nova_compute[186479]: 2026-02-17 17:37:39.065 186483 WARNING nova.compute.manager [req-f0cf0433-c358-41c6-ac82-cf4e4345375a req-fa873653-9d3a-4836-ac80-fd85778a4753 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received unexpected event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a for instance with vm_state active and task_state None.
Feb 17 17:37:40 compute-0 nova_compute[186479]: 2026-02-17 17:37:40.233 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:40 compute-0 nova_compute[186479]: 2026-02-17 17:37:40.679 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:40 compute-0 ovn_controller[96568]: 2026-02-17T17:37:40Z|00166|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:40 compute-0 NetworkManager[56323]: <info>  [1771349860.6848] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Feb 17 17:37:40 compute-0 NetworkManager[56323]: <info>  [1771349860.6857] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Feb 17 17:37:40 compute-0 ovn_controller[96568]: 2026-02-17T17:37:40Z|00167|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:40 compute-0 nova_compute[186479]: 2026-02-17 17:37:40.688 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:40 compute-0 nova_compute[186479]: 2026-02-17 17:37:40.693 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:40 compute-0 nova_compute[186479]: 2026-02-17 17:37:40.841 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:41 compute-0 nova_compute[186479]: 2026-02-17 17:37:41.543 186483 DEBUG nova.compute.manager [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:37:41 compute-0 nova_compute[186479]: 2026-02-17 17:37:41.544 186483 DEBUG nova.compute.manager [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing instance network info cache due to event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:37:41 compute-0 nova_compute[186479]: 2026-02-17 17:37:41.544 186483 DEBUG oslo_concurrency.lockutils [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:37:41 compute-0 nova_compute[186479]: 2026-02-17 17:37:41.545 186483 DEBUG oslo_concurrency.lockutils [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:37:41 compute-0 nova_compute[186479]: 2026-02-17 17:37:41.545 186483 DEBUG nova.network.neutron [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:37:43 compute-0 nova_compute[186479]: 2026-02-17 17:37:43.055 186483 DEBUG nova.network.neutron [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updated VIF entry in instance network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:37:43 compute-0 nova_compute[186479]: 2026-02-17 17:37:43.055 186483 DEBUG nova.network.neutron [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updating instance_info_cache with network_info: [{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:37:43 compute-0 nova_compute[186479]: 2026-02-17 17:37:43.080 186483 DEBUG oslo_concurrency.lockutils [req-ef7ea939-a7b9-423b-b361-7c0ba91b393e req-d57a0ea2-3603-4ee4-afcf-90d5661adb91 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:37:45 compute-0 nova_compute[186479]: 2026-02-17 17:37:45.235 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:45 compute-0 nova_compute[186479]: 2026-02-17 17:37:45.843 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:46 compute-0 podman[220483]: 2026-02-17 17:37:46.759762385 +0000 UTC m=+0.098524392 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 17 17:37:49 compute-0 ovn_controller[96568]: 2026-02-17T17:37:49Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:85:08 10.100.0.11
Feb 17 17:37:49 compute-0 ovn_controller[96568]: 2026-02-17T17:37:49Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:85:08 10.100.0.11
Feb 17 17:37:50 compute-0 nova_compute[186479]: 2026-02-17 17:37:50.238 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:50 compute-0 nova_compute[186479]: 2026-02-17 17:37:50.845 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:51 compute-0 podman[220520]: 2026-02-17 17:37:51.708153545 +0000 UTC m=+0.049042397 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:37:55 compute-0 nova_compute[186479]: 2026-02-17 17:37:55.264 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:55 compute-0 podman[220545]: 2026-02-17 17:37:55.747214437 +0000 UTC m=+0.089794781 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 17 17:37:55 compute-0 nova_compute[186479]: 2026-02-17 17:37:55.847 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:55 compute-0 nova_compute[186479]: 2026-02-17 17:37:55.939 186483 INFO nova.compute.manager [None req-9dae0bb9-5281-448d-8a0b-9bc97e04f44b 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Get console output
Feb 17 17:37:55 compute-0 nova_compute[186479]: 2026-02-17 17:37:55.946 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:37:56 compute-0 ovn_controller[96568]: 2026-02-17T17:37:56Z|00168|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:56 compute-0 nova_compute[186479]: 2026-02-17 17:37:56.695 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:56 compute-0 ovn_controller[96568]: 2026-02-17T17:37:56Z|00169|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:56 compute-0 nova_compute[186479]: 2026-02-17 17:37:56.717 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:57 compute-0 nova_compute[186479]: 2026-02-17 17:37:57.895 186483 INFO nova.compute.manager [None req-d949e71f-72c6-450a-9dbf-164591c6b251 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Get console output
Feb 17 17:37:57 compute-0 nova_compute[186479]: 2026-02-17 17:37:57.900 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:37:58 compute-0 NetworkManager[56323]: <info>  [1771349878.9471] manager: (patch-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Feb 17 17:37:58 compute-0 nova_compute[186479]: 2026-02-17 17:37:58.946 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:58 compute-0 NetworkManager[56323]: <info>  [1771349878.9480] manager: (patch-br-int-to-provnet-6e4e24a3-f9cf-459d-ae00-877a9bac3999): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Feb 17 17:37:58 compute-0 ovn_controller[96568]: 2026-02-17T17:37:58Z|00170|binding|INFO|Releasing lport 98cb5b3b-47a3-4c0b-be97-0aa9b9fc1391 from this chassis (sb_readonly=0)
Feb 17 17:37:58 compute-0 nova_compute[186479]: 2026-02-17 17:37:58.956 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:58 compute-0 nova_compute[186479]: 2026-02-17 17:37:58.963 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:59 compute-0 nova_compute[186479]: 2026-02-17 17:37:59.264 186483 INFO nova.compute.manager [None req-8ac3a4b4-3834-4c83-b2f4-e617992716d0 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Get console output
Feb 17 17:37:59 compute-0 nova_compute[186479]: 2026-02-17 17:37:59.269 215112 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 17 17:37:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:59.759 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:37:59 compute-0 nova_compute[186479]: 2026-02-17 17:37:59.759 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:37:59 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:37:59.760 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.115 186483 DEBUG nova.compute.manager [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.115 186483 DEBUG nova.compute.manager [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing instance network info cache due to event network-changed-1358eab8-69e5-4c5f-8d62-27e45e17fa8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.116 186483 DEBUG oslo_concurrency.lockutils [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.116 186483 DEBUG oslo_concurrency.lockutils [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquired lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.116 186483 DEBUG nova.network.neutron [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Refreshing network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.151 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.152 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.152 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.152 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.153 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.153 186483 INFO nova.compute.manager [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Terminating instance
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.154 186483 DEBUG nova.compute.manager [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 17 17:38:00 compute-0 kernel: tap1358eab8-69 (unregistering): left promiscuous mode
Feb 17 17:38:00 compute-0 NetworkManager[56323]: <info>  [1771349880.1883] device (tap1358eab8-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.191 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 ovn_controller[96568]: 2026-02-17T17:38:00Z|00171|binding|INFO|Releasing lport 1358eab8-69e5-4c5f-8d62-27e45e17fa8a from this chassis (sb_readonly=0)
Feb 17 17:38:00 compute-0 ovn_controller[96568]: 2026-02-17T17:38:00Z|00172|binding|INFO|Setting lport 1358eab8-69e5-4c5f-8d62-27e45e17fa8a down in Southbound
Feb 17 17:38:00 compute-0 ovn_controller[96568]: 2026-02-17T17:38:00Z|00173|binding|INFO|Removing iface tap1358eab8-69 ovn-installed in OVS
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.194 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.199 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:85:08 10.100.0.11'], port_security=['fa:16:3e:15:85:08 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '9fbffa09-d91f-4a5b-9cd4-db1243a19e88', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c865e868-e4b3-46db-a4d6-df6c2e739109', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b751dc9f26b74fe7b2bdea4718093b3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'afffdd06-546b-4c25-8b84-bc6108cdd488', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9e903a7-0c6a-47d2-ba74-58fea5aeb287, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>], logical_port=1358eab8-69e5-4c5f-8d62-27e45e17fa8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f18a6697df0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.200 105898 INFO neutron.agent.ovn.metadata.agent [-] Port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a in datapath c865e868-e4b3-46db-a4d6-df6c2e739109 unbound from our chassis
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.201 105898 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c865e868-e4b3-46db-a4d6-df6c2e739109, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.202 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[b31b08cd-7b85-4a30-8d50-797d25ac1763]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.203 105898 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109 namespace which is not needed anymore
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.205 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 17 17:38:00 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.389s CPU time.
Feb 17 17:38:00 compute-0 systemd-machined[155877]: Machine qemu-13-instance-0000000d terminated.
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.266 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [NOTICE]   (220446) : haproxy version is 2.8.14-c23fe91
Feb 17 17:38:00 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [NOTICE]   (220446) : path to executable is /usr/sbin/haproxy
Feb 17 17:38:00 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [WARNING]  (220446) : Exiting Master process...
Feb 17 17:38:00 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [ALERT]    (220446) : Current worker (220448) exited with code 143 (Terminated)
Feb 17 17:38:00 compute-0 neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109[220442]: [WARNING]  (220446) : All workers exited. Exiting... (0)
Feb 17 17:38:00 compute-0 systemd[1]: libpod-8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28.scope: Deactivated successfully.
Feb 17 17:38:00 compute-0 podman[220593]: 2026-02-17 17:38:00.316565003 +0000 UTC m=+0.042878158 container died 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 17 17:38:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28-userdata-shm.mount: Deactivated successfully.
Feb 17 17:38:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-8bd0146b74a92bb7d878ed8c8bc7156eb17a26874a3abed73f92346c34d169c3-merged.mount: Deactivated successfully.
Feb 17 17:38:00 compute-0 podman[220593]: 2026-02-17 17:38:00.360984997 +0000 UTC m=+0.087298152 container cleanup 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.372 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.377 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 systemd[1]: libpod-conmon-8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28.scope: Deactivated successfully.
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.405 186483 INFO nova.virt.libvirt.driver [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Instance destroyed successfully.
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.406 186483 DEBUG nova.objects.instance [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lazy-loading 'resources' on Instance uuid 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 17 17:38:00 compute-0 podman[220623]: 2026-02-17 17:38:00.418875556 +0000 UTC m=+0.041082324 container remove 8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.419 186483 DEBUG nova.virt.libvirt.vif [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-17T17:37:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1618356044',display_name='tempest-TestNetworkBasicOps-server-1618356044',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1618356044',id=13,image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAjQmULcqKhsDZFloeG+5/btusIEnBrPujxK/2cYvYRTFgh2kVKyTQB/92xag37zwEO0bktN7Cm+U8jisxmvTC7qfYDwo6uNuZQK1rDz5JKRWzOf8xpCtVioqdsTDnkjWw==',key_name='tempest-TestNetworkBasicOps-77023403',keypairs=<?>,launch_index=0,launched_at=2026-02-17T17:37:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b751dc9f26b74fe7b2bdea4718093b3c',ramdisk_id='',reservation_id='r-kx7ngm10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='4bdf9e7b-0bb3-4689-ab5a-7f21ae341e98',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1366681220',owner_user_name='tempest-TestNetworkBasicOps-1366681220-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-17T17:37:37Z,user_data=None,user_id='3f041abe92134380b8de39091bce5989',uuid=9fbffa09-d91f-4a5b-9cd4-db1243a19e88,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.419 186483 DEBUG nova.network.os_vif_util [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converting VIF {"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.420 186483 DEBUG nova.network.os_vif_util [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.420 186483 DEBUG os_vif [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.422 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.422 186483 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1358eab8-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.422 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[133c56fc-1451-4e0b-acfa-20d5bff6c090]: (4, ('Tue Feb 17 05:38:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109 (8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28)\n8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28\nTue Feb 17 05:38:00 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109 (8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28)\n8de9b304f440878b5a9eac70ab02b2daa1cd5cdb40f613e843dfb0906ba8ab28\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.423 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.424 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.425 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[7d312b09-4025-49ca-8068-c41e0881c76e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.426 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc865e868-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.427 186483 INFO os_vif [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:85:08,bridge_name='br-int',has_traffic_filtering=True,id=1358eab8-69e5-4c5f-8d62-27e45e17fa8a,network=Network(c865e868-e4b3-46db-a4d6-df6c2e739109),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1358eab8-69')
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.427 186483 INFO nova.virt.libvirt.driver [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Deleting instance files /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88_del
Feb 17 17:38:00 compute-0 kernel: tapc865e868-e0: left promiscuous mode
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.427 186483 INFO nova.virt.libvirt.driver [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Deletion of /var/lib/nova/instances/9fbffa09-d91f-4a5b-9cd4-db1243a19e88_del complete
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.433 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.435 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[df7169be-f52b-4da4-a2a1-3094e983dd48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.454 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[844aea55-5157-4539-b412-094f34a520c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.455 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[73924f00-3394-4cbd-98a3-f2bb0c7d244b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.465 215208 DEBUG oslo.privsep.daemon [-] privsep: reply[01e59a8b-3805-4128-bcef-0207ee26c611]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361204, 'reachable_time': 15956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220656, 'error': None, 'target': 'ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.467 106424 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c865e868-e4b3-46db-a4d6-df6c2e739109 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 17 17:38:00 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:00.467 106424 DEBUG oslo.privsep.daemon [-] privsep: reply[d392cec9-f8ef-48d2-8986-562421b8094a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 17 17:38:00 compute-0 systemd[1]: run-netns-ovnmeta\x2dc865e868\x2de4b3\x2d46db\x2da4d6\x2ddf6c2e739109.mount: Deactivated successfully.
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.486 186483 INFO nova.compute.manager [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.487 186483 DEBUG oslo.service.loopingcall [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.487 186483 DEBUG nova.compute.manager [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.487 186483 DEBUG nova.network.neutron [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.496 186483 DEBUG nova.compute.manager [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-unplugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.496 186483 DEBUG oslo_concurrency.lockutils [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.496 186483 DEBUG oslo_concurrency.lockutils [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.496 186483 DEBUG oslo_concurrency.lockutils [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.496 186483 DEBUG nova.compute.manager [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] No waiting events found dispatching network-vif-unplugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:38:00 compute-0 nova_compute[186479]: 2026-02-17 17:38:00.497 186483 DEBUG nova.compute.manager [req-35c6a532-1397-40b1-a5d5-bec7e1e0c88d req-91d15326-d347-4f0d-a45a-578520e84ad6 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-unplugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.080 186483 DEBUG nova.network.neutron [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.096 186483 INFO nova.compute.manager [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Took 0.61 seconds to deallocate network for instance.
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.148 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.149 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.206 186483 DEBUG nova.compute.provider_tree [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.220 186483 DEBUG nova.scheduler.client.report [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.240 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.280 186483 INFO nova.scheduler.client.report [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Deleted allocations for instance 9fbffa09-d91f-4a5b-9cd4-db1243a19e88
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.343 186483 DEBUG oslo_concurrency.lockutils [None req-1fbbb9be-d5a5-4843-8a66-7e8974cb19d7 3f041abe92134380b8de39091bce5989 b751dc9f26b74fe7b2bdea4718093b3c - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.460 186483 DEBUG nova.network.neutron [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updated VIF entry in instance network info cache for port 1358eab8-69e5-4c5f-8d62-27e45e17fa8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.461 186483 DEBUG nova.network.neutron [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Updating instance_info_cache with network_info: [{"id": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "address": "fa:16:3e:15:85:08", "network": {"id": "c865e868-e4b3-46db-a4d6-df6c2e739109", "bridge": "br-int", "label": "tempest-network-smoke--584534185", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b751dc9f26b74fe7b2bdea4718093b3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1358eab8-69", "ovs_interfaceid": "1358eab8-69e5-4c5f-8d62-27e45e17fa8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 17 17:38:01 compute-0 nova_compute[186479]: 2026-02-17 17:38:01.492 186483 DEBUG oslo_concurrency.lockutils [req-462b03ca-e068-4b4f-a770-a1bf3b964922 req-d44efd86-394e-4af6-8286-072a20c84437 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Releasing lock "refresh_cache-9fbffa09-d91f-4a5b-9cd4-db1243a19e88" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.072 186483 DEBUG nova.compute.manager [req-7cab9337-d126-46cc-948c-847ab0fa98eb req-8528028e-b92a-40e9-9540-2a68b03857de 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-deleted-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.569 186483 DEBUG nova.compute.manager [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.569 186483 DEBUG oslo_concurrency.lockutils [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Acquiring lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.570 186483 DEBUG oslo_concurrency.lockutils [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.570 186483 DEBUG oslo_concurrency.lockutils [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] Lock "9fbffa09-d91f-4a5b-9cd4-db1243a19e88-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.570 186483 DEBUG nova.compute.manager [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] No waiting events found dispatching network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 17 17:38:02 compute-0 nova_compute[186479]: 2026-02-17 17:38:02.570 186483 WARNING nova.compute.manager [req-d01bd594-a02b-472f-812b-bf3fe5dd9bf6 req-a31eb15e-113d-46ba-9ac3-afdd74d61b50 9ea87d58e850421a9bdd58793ae6edd1 cb3b7f5bdb5742b9add143eec675c5a2 - - default default] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Received unexpected event network-vif-plugged-1358eab8-69e5-4c5f-8d62-27e45e17fa8a for instance with vm_state deleted and task_state None.
Feb 17 17:38:02 compute-0 podman[220658]: 2026-02-17 17:38:02.723750112 +0000 UTC m=+0.064346357 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:38:02 compute-0 podman[220657]: 2026-02-17 17:38:02.727736708 +0000 UTC m=+0.066491789 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 17 17:38:04 compute-0 nova_compute[186479]: 2026-02-17 17:38:04.156 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:04 compute-0 nova_compute[186479]: 2026-02-17 17:38:04.193 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.269 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:38:05 compute-0 nova_compute[186479]: 2026-02-17 17:38:05.425 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:06 compute-0 nova_compute[186479]: 2026-02-17 17:38:06.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.328 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.328 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.472 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.473 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5743MB free_disk=73.2068977355957GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.473 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.473 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.536 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.536 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.562 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.580 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.596 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:38:07 compute-0 nova_compute[186479]: 2026-02-17 17:38:07.597 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:09 compute-0 podman[220700]: 2026-02-17 17:38:09.720059545 +0000 UTC m=+0.060387901 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:38:09 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:09.763 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:38:10 compute-0 nova_compute[186479]: 2026-02-17 17:38:10.272 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:10 compute-0 nova_compute[186479]: 2026-02-17 17:38:10.428 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:10 compute-0 nova_compute[186479]: 2026-02-17 17:38:10.590 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:10.957 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:38:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:10.957 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:38:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:38:10.957 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.274 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.321 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.405 186483 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771349880.4038424, 9fbffa09-d91f-4a5b-9cd4-db1243a19e88 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.405 186483 INFO nova.compute.manager [-] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] VM Stopped (Lifecycle Event)
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.429 186483 DEBUG nova.compute.manager [None req-a3349d9b-c191-439a-894d-e2461c386b13 - - - - - -] [instance: 9fbffa09-d91f-4a5b-9cd4-db1243a19e88] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 17 17:38:15 compute-0 nova_compute[186479]: 2026-02-17 17:38:15.430 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:17 compute-0 podman[220725]: 2026-02-17 17:38:17.741799679 +0000 UTC m=+0.079468743 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 17 17:38:20 compute-0 nova_compute[186479]: 2026-02-17 17:38:20.274 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:20 compute-0 nova_compute[186479]: 2026-02-17 17:38:20.432 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:22 compute-0 podman[220753]: 2026-02-17 17:38:22.730940892 +0000 UTC m=+0.075968497 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 17 17:38:25 compute-0 nova_compute[186479]: 2026-02-17 17:38:25.276 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:25 compute-0 nova_compute[186479]: 2026-02-17 17:38:25.434 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:26 compute-0 podman[220779]: 2026-02-17 17:38:26.736145268 +0000 UTC m=+0.080557779 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public)
Feb 17 17:38:30 compute-0 nova_compute[186479]: 2026-02-17 17:38:30.279 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:30 compute-0 nova_compute[186479]: 2026-02-17 17:38:30.436 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:33 compute-0 podman[220800]: 2026-02-17 17:38:33.722336636 +0000 UTC m=+0.062021951 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 17 17:38:33 compute-0 podman[220801]: 2026-02-17 17:38:33.749276977 +0000 UTC m=+0.080472377 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 17 17:38:34 compute-0 ovn_controller[96568]: 2026-02-17T17:38:34Z|00174|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 17 17:38:35 compute-0 nova_compute[186479]: 2026-02-17 17:38:35.332 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:35 compute-0 nova_compute[186479]: 2026-02-17 17:38:35.437 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:40 compute-0 nova_compute[186479]: 2026-02-17 17:38:40.333 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:40 compute-0 nova_compute[186479]: 2026-02-17 17:38:40.439 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:40 compute-0 podman[220840]: 2026-02-17 17:38:40.716632721 +0000 UTC m=+0.052487669 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:38:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:38:45 compute-0 nova_compute[186479]: 2026-02-17 17:38:45.384 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:45 compute-0 nova_compute[186479]: 2026-02-17 17:38:45.441 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:48 compute-0 podman[220866]: 2026-02-17 17:38:48.734419111 +0000 UTC m=+0.077079366 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 17 17:38:50 compute-0 nova_compute[186479]: 2026-02-17 17:38:50.386 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:50 compute-0 nova_compute[186479]: 2026-02-17 17:38:50.443 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:53 compute-0 podman[220892]: 2026-02-17 17:38:53.740949645 +0000 UTC m=+0.083017169 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:38:55 compute-0 nova_compute[186479]: 2026-02-17 17:38:55.389 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:55 compute-0 nova_compute[186479]: 2026-02-17 17:38:55.445 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:38:57 compute-0 nova_compute[186479]: 2026-02-17 17:38:57.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:38:57 compute-0 nova_compute[186479]: 2026-02-17 17:38:57.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 17 17:38:57 compute-0 podman[220916]: 2026-02-17 17:38:57.707015844 +0000 UTC m=+0.048292909 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, distribution-scope=public, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, version=9.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 17 17:39:00 compute-0 nova_compute[186479]: 2026-02-17 17:39:00.390 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:00 compute-0 nova_compute[186479]: 2026-02-17 17:39:00.447 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:01 compute-0 sshd-session[220936]: Accepted publickey for zuul from 192.168.122.10 port 45400 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:39:01 compute-0 systemd-logind[806]: New session 26 of user zuul.
Feb 17 17:39:01 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 17 17:39:01 compute-0 sshd-session[220936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:39:01 compute-0 sudo[220940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 17 17:39:01 compute-0 sudo[220940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:39:02 compute-0 nova_compute[186479]: 2026-02-17 17:39:02.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:03 compute-0 nova_compute[186479]: 2026-02-17 17:39:03.562 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:03 compute-0 nova_compute[186479]: 2026-02-17 17:39:03.563 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:03 compute-0 nova_compute[186479]: 2026-02-17 17:39:03.563 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 17 17:39:03 compute-0 nova_compute[186479]: 2026-02-17 17:39:03.588 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 17 17:39:04 compute-0 podman[221099]: 2026-02-17 17:39:04.716527495 +0000 UTC m=+0.059631672 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:39:04 compute-0 podman[221100]: 2026-02-17 17:39:04.7299594 +0000 UTC m=+0.065960825 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 17 17:39:05 compute-0 nova_compute[186479]: 2026-02-17 17:39:05.392 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:05 compute-0 nova_compute[186479]: 2026-02-17 17:39:05.449 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:06 compute-0 nova_compute[186479]: 2026-02-17 17:39:06.329 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:06 compute-0 nova_compute[186479]: 2026-02-17 17:39:06.329 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:07 compute-0 nova_compute[186479]: 2026-02-17 17:39:07.297 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:07 compute-0 nova_compute[186479]: 2026-02-17 17:39:07.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:07 compute-0 nova_compute[186479]: 2026-02-17 17:39:07.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:07 compute-0 nova_compute[186479]: 2026-02-17 17:39:07.302 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:39:07 compute-0 ovs-vsctl[221188]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 17 17:39:07 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220964 (sos)
Feb 17 17:39:07 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 17 17:39:07 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 17 17:39:08 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 17 17:39:08 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.328 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:39:08 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.470 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.471 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.20650863647461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.471 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.471 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.729 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.729 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.758 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.780 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.782 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:39:08 compute-0 nova_compute[186479]: 2026-02-17 17:39:08.782 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:39:09 compute-0 crontab[221599]: (root) LIST (root)
Feb 17 17:39:10 compute-0 nova_compute[186479]: 2026-02-17 17:39:10.393 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:10 compute-0 nova_compute[186479]: 2026-02-17 17:39:10.450 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:10 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 17 17:39:10 compute-0 systemd[1]: Starting Hostname Service...
Feb 17 17:39:10 compute-0 podman[221690]: 2026-02-17 17:39:10.957066746 +0000 UTC m=+0.054234289 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:39:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:39:10.958 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:39:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:39:10.959 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:39:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:39:10.959 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:39:10 compute-0 systemd[1]: Started Hostname Service.
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.394 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.451 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.782 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.782 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.783 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:39:15 compute-0 nova_compute[186479]: 2026-02-17 17:39:15.798 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:39:19 compute-0 ovs-appctl[222831]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 17 17:39:19 compute-0 ovs-appctl[222835]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 17 17:39:19 compute-0 ovs-appctl[222842]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 17 17:39:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2721318480-merged.mount: Deactivated successfully.
Feb 17 17:39:19 compute-0 podman[222911]: 2026-02-17 17:39:19.755195651 +0000 UTC m=+0.096701923 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 17 17:39:20 compute-0 nova_compute[186479]: 2026-02-17 17:39:20.395 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:20 compute-0 nova_compute[186479]: 2026-02-17 17:39:20.452 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:23 compute-0 podman[223902]: 2026-02-17 17:39:23.840313308 +0000 UTC m=+0.066170688 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:39:25 compute-0 nova_compute[186479]: 2026-02-17 17:39:25.398 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:25 compute-0 nova_compute[186479]: 2026-02-17 17:39:25.453 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:25 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 17 17:39:26 compute-0 systemd[1]: Starting Time & Date Service...
Feb 17 17:39:26 compute-0 systemd[1]: Started Time & Date Service.
Feb 17 17:39:28 compute-0 podman[224408]: 2026-02-17 17:39:28.739082246 +0000 UTC m=+0.073931773 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 17 17:39:30 compute-0 nova_compute[186479]: 2026-02-17 17:39:30.400 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:30 compute-0 nova_compute[186479]: 2026-02-17 17:39:30.454 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:35 compute-0 nova_compute[186479]: 2026-02-17 17:39:35.446 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:35 compute-0 nova_compute[186479]: 2026-02-17 17:39:35.455 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:35 compute-0 podman[224433]: 2026-02-17 17:39:35.731376962 +0000 UTC m=+0.062365965 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 17 17:39:35 compute-0 podman[224434]: 2026-02-17 17:39:35.748917715 +0000 UTC m=+0.078506804 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 17 17:39:38 compute-0 sshd-session[224470]: Received disconnect from 45.148.10.157 port 29700:11:  [preauth]
Feb 17 17:39:38 compute-0 sshd-session[224470]: Disconnected from authenticating user root 45.148.10.157 port 29700 [preauth]
Feb 17 17:39:40 compute-0 nova_compute[186479]: 2026-02-17 17:39:40.448 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:40 compute-0 nova_compute[186479]: 2026-02-17 17:39:40.457 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:41 compute-0 podman[224472]: 2026-02-17 17:39:41.712508722 +0000 UTC m=+0.052868236 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:39:45 compute-0 nova_compute[186479]: 2026-02-17 17:39:45.450 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:45 compute-0 nova_compute[186479]: 2026-02-17 17:39:45.459 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:49 compute-0 sudo[220940]: pam_unix(sudo:session): session closed for user root
Feb 17 17:39:49 compute-0 sshd-session[220939]: Received disconnect from 192.168.122.10 port 45400:11: disconnected by user
Feb 17 17:39:49 compute-0 sshd-session[220939]: Disconnected from user zuul 192.168.122.10 port 45400
Feb 17 17:39:49 compute-0 sshd-session[220936]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:39:49 compute-0 systemd-logind[806]: Session 26 logged out. Waiting for processes to exit.
Feb 17 17:39:49 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 17 17:39:49 compute-0 systemd[1]: session-26.scope: Consumed 1min 17.350s CPU time, 631.8M memory peak, read 246.2M from disk, written 22.8M to disk.
Feb 17 17:39:49 compute-0 systemd-logind[806]: Removed session 26.
Feb 17 17:39:49 compute-0 sshd-session[224497]: Accepted publickey for zuul from 192.168.122.10 port 33346 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:39:49 compute-0 systemd-logind[806]: New session 27 of user zuul.
Feb 17 17:39:49 compute-0 systemd[1]: Started Session 27 of User zuul.
Feb 17 17:39:49 compute-0 sshd-session[224497]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:39:49 compute-0 sudo[224501]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-02-17-knwtqkl.tar.xz
Feb 17 17:39:49 compute-0 sudo[224501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:39:49 compute-0 sudo[224501]: pam_unix(sudo:session): session closed for user root
Feb 17 17:39:49 compute-0 sshd-session[224500]: Received disconnect from 192.168.122.10 port 33346:11: disconnected by user
Feb 17 17:39:49 compute-0 sshd-session[224500]: Disconnected from user zuul 192.168.122.10 port 33346
Feb 17 17:39:49 compute-0 sshd-session[224497]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:39:49 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 17 17:39:49 compute-0 systemd-logind[806]: Session 27 logged out. Waiting for processes to exit.
Feb 17 17:39:49 compute-0 systemd-logind[806]: Removed session 27.
Feb 17 17:39:49 compute-0 sshd-session[224526]: Accepted publickey for zuul from 192.168.122.10 port 33362 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:39:49 compute-0 systemd-logind[806]: New session 28 of user zuul.
Feb 17 17:39:49 compute-0 systemd[1]: Started Session 28 of User zuul.
Feb 17 17:39:49 compute-0 sshd-session[224526]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:39:49 compute-0 sudo[224546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Feb 17 17:39:49 compute-0 sudo[224546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:39:49 compute-0 sudo[224546]: pam_unix(sudo:session): session closed for user root
Feb 17 17:39:49 compute-0 sshd-session[224530]: Received disconnect from 192.168.122.10 port 33362:11: disconnected by user
Feb 17 17:39:49 compute-0 sshd-session[224530]: Disconnected from user zuul 192.168.122.10 port 33362
Feb 17 17:39:49 compute-0 sshd-session[224526]: pam_unix(sshd:session): session closed for user zuul
Feb 17 17:39:49 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Feb 17 17:39:49 compute-0 systemd-logind[806]: Session 28 logged out. Waiting for processes to exit.
Feb 17 17:39:49 compute-0 systemd-logind[806]: Removed session 28.
Feb 17 17:39:49 compute-0 podman[224529]: 2026-02-17 17:39:49.911886167 +0000 UTC m=+0.121087761 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:39:50 compute-0 nova_compute[186479]: 2026-02-17 17:39:50.452 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:50 compute-0 nova_compute[186479]: 2026-02-17 17:39:50.461 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:54 compute-0 podman[224585]: 2026-02-17 17:39:54.706694879 +0000 UTC m=+0.050930690 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:39:55 compute-0 nova_compute[186479]: 2026-02-17 17:39:55.456 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:55 compute-0 nova_compute[186479]: 2026-02-17 17:39:55.462 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:39:56 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 17 17:39:56 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 17 17:39:59 compute-0 podman[224616]: 2026-02-17 17:39:59.7233409 +0000 UTC m=+0.062328655 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git)
Feb 17 17:40:00 compute-0 nova_compute[186479]: 2026-02-17 17:40:00.456 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:00 compute-0 nova_compute[186479]: 2026-02-17 17:40:00.462 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:04 compute-0 nova_compute[186479]: 2026-02-17 17:40:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:05 compute-0 nova_compute[186479]: 2026-02-17 17:40:05.459 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:05 compute-0 nova_compute[186479]: 2026-02-17 17:40:05.463 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:06 compute-0 podman[224637]: 2026-02-17 17:40:06.720287167 +0000 UTC m=+0.060901989 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 17 17:40:06 compute-0 podman[224638]: 2026-02-17 17:40:06.754369699 +0000 UTC m=+0.089917770 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 17 17:40:07 compute-0 nova_compute[186479]: 2026-02-17 17:40:07.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:07 compute-0 nova_compute[186479]: 2026-02-17 17:40:07.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.339 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.340 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.340 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.340 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.486 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.487 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5622MB free_disk=73.20638656616211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.487 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.487 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.584 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.584 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.645 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing inventories for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.794 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating ProviderTree inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.794 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.816 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing aggregate associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.854 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing trait associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SHA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_BMI2,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_AVX,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.880 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.901 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.903 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:40:08 compute-0 nova_compute[186479]: 2026-02-17 17:40:08.904 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:40:10 compute-0 nova_compute[186479]: 2026-02-17 17:40:10.462 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:10 compute-0 nova_compute[186479]: 2026-02-17 17:40:10.464 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:10 compute-0 nova_compute[186479]: 2026-02-17 17:40:10.899 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:10 compute-0 nova_compute[186479]: 2026-02-17 17:40:10.920 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:40:10.961 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:40:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:40:10.962 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:40:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:40:10.962 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:40:12 compute-0 podman[224674]: 2026-02-17 17:40:12.712509175 +0000 UTC m=+0.049830793 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:40:15 compute-0 nova_compute[186479]: 2026-02-17 17:40:15.464 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:17 compute-0 nova_compute[186479]: 2026-02-17 17:40:17.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:40:17 compute-0 nova_compute[186479]: 2026-02-17 17:40:17.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:40:17 compute-0 nova_compute[186479]: 2026-02-17 17:40:17.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:40:17 compute-0 nova_compute[186479]: 2026-02-17 17:40:17.336 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:40:20 compute-0 nova_compute[186479]: 2026-02-17 17:40:20.466 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:20 compute-0 podman[224699]: 2026-02-17 17:40:20.76012628 +0000 UTC m=+0.099816838 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 17 17:40:25 compute-0 nova_compute[186479]: 2026-02-17 17:40:25.467 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:25 compute-0 podman[224725]: 2026-02-17 17:40:25.70410215 +0000 UTC m=+0.048304446 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:40:30 compute-0 nova_compute[186479]: 2026-02-17 17:40:30.469 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:40:30 compute-0 podman[224749]: 2026-02-17 17:40:30.557553044 +0000 UTC m=+0.054898814 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.471 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.472 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.472 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.472 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.473 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:40:35 compute-0 nova_compute[186479]: 2026-02-17 17:40:35.474 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:40:37 compute-0 podman[224771]: 2026-02-17 17:40:37.711186611 +0000 UTC m=+0.048915401 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute)
Feb 17 17:40:37 compute-0 podman[224770]: 2026-02-17 17:40:37.727857793 +0000 UTC m=+0.069992919 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 17 17:40:40 compute-0 nova_compute[186479]: 2026-02-17 17:40:40.474 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:43 compute-0 podman[224808]: 2026-02-17 17:40:43.704694079 +0000 UTC m=+0.048434159 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:40:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:40:45 compute-0 nova_compute[186479]: 2026-02-17 17:40:45.475 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:50 compute-0 nova_compute[186479]: 2026-02-17 17:40:50.475 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:50 compute-0 nova_compute[186479]: 2026-02-17 17:40:50.478 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:51 compute-0 podman[224834]: 2026-02-17 17:40:51.739213329 +0000 UTC m=+0.078488924 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 17 17:40:55 compute-0 nova_compute[186479]: 2026-02-17 17:40:55.477 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:55 compute-0 nova_compute[186479]: 2026-02-17 17:40:55.479 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:40:56 compute-0 podman[224860]: 2026-02-17 17:40:56.72142464 +0000 UTC m=+0.062377136 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.480 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.482 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.482 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.482 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.522 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:00 compute-0 nova_compute[186479]: 2026-02-17 17:41:00.523 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:00 compute-0 podman[224884]: 2026-02-17 17:41:00.708839939 +0000 UTC m=+0.053157252 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Feb 17 17:41:04 compute-0 nova_compute[186479]: 2026-02-17 17:41:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:05 compute-0 nova_compute[186479]: 2026-02-17 17:41:05.523 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:07 compute-0 nova_compute[186479]: 2026-02-17 17:41:07.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:07 compute-0 nova_compute[186479]: 2026-02-17 17:41:07.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:08 compute-0 podman[224905]: 2026-02-17 17:41:08.743979302 +0000 UTC m=+0.084218451 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 17 17:41:08 compute-0 podman[224906]: 2026-02-17 17:41:08.744865404 +0000 UTC m=+0.084254093 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.334 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.334 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.334 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.334 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.442 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.443 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.20640563964844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.443 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.443 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.496 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.496 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.527 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.543 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.545 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:41:09 compute-0 nova_compute[186479]: 2026-02-17 17:41:09.545 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:41:10 compute-0 nova_compute[186479]: 2026-02-17 17:41:10.525 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:41:10.963 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:41:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:41:10.963 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:41:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:41:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:41:11 compute-0 nova_compute[186479]: 2026-02-17 17:41:11.546 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:14 compute-0 podman[224945]: 2026-02-17 17:41:14.750808793 +0000 UTC m=+0.044804702 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 17 17:41:15 compute-0 nova_compute[186479]: 2026-02-17 17:41:15.526 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:18 compute-0 nova_compute[186479]: 2026-02-17 17:41:18.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:41:18 compute-0 nova_compute[186479]: 2026-02-17 17:41:18.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:41:18 compute-0 nova_compute[186479]: 2026-02-17 17:41:18.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:41:18 compute-0 nova_compute[186479]: 2026-02-17 17:41:18.334 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.528 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.530 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.566 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:20 compute-0 nova_compute[186479]: 2026-02-17 17:41:20.567 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:22 compute-0 podman[224970]: 2026-02-17 17:41:22.795379771 +0000 UTC m=+0.131862568 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.567 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.570 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.570 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.570 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.600 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:25 compute-0 nova_compute[186479]: 2026-02-17 17:41:25.601 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:27 compute-0 podman[224998]: 2026-02-17 17:41:27.69821834 +0000 UTC m=+0.045324004 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:41:30 compute-0 nova_compute[186479]: 2026-02-17 17:41:30.601 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:30 compute-0 nova_compute[186479]: 2026-02-17 17:41:30.603 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:31 compute-0 podman[225023]: 2026-02-17 17:41:31.706320618 +0000 UTC m=+0.042140613 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7)
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.603 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.604 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.604 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.604 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.604 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:35 compute-0 nova_compute[186479]: 2026-02-17 17:41:35.605 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:39 compute-0 podman[225044]: 2026-02-17 17:41:39.699810185 +0000 UTC m=+0.045653941 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 17 17:41:39 compute-0 podman[225045]: 2026-02-17 17:41:39.732112967 +0000 UTC m=+0.075402515 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 17 17:41:40 compute-0 nova_compute[186479]: 2026-02-17 17:41:40.606 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:40 compute-0 nova_compute[186479]: 2026-02-17 17:41:40.607 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:40 compute-0 nova_compute[186479]: 2026-02-17 17:41:40.608 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:40 compute-0 nova_compute[186479]: 2026-02-17 17:41:40.608 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:40 compute-0 nova_compute[186479]: 2026-02-17 17:41:40.608 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.610 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.612 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.612 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.612 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.660 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:45 compute-0 nova_compute[186479]: 2026-02-17 17:41:45.660 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:45 compute-0 podman[225083]: 2026-02-17 17:41:45.72482585 +0000 UTC m=+0.047859141 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.662 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.663 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.663 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.664 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.664 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:41:50 compute-0 nova_compute[186479]: 2026-02-17 17:41:50.666 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:53 compute-0 podman[225108]: 2026-02-17 17:41:53.756816787 +0000 UTC m=+0.100263771 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 17 17:41:55 compute-0 nova_compute[186479]: 2026-02-17 17:41:55.663 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:41:58 compute-0 podman[225135]: 2026-02-17 17:41:58.726749596 +0000 UTC m=+0.072840669 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:42:00 compute-0 nova_compute[186479]: 2026-02-17 17:42:00.666 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:02 compute-0 podman[225159]: 2026-02-17 17:42:02.717447843 +0000 UTC m=+0.054087070 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, version=9.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Feb 17 17:42:04 compute-0 nova_compute[186479]: 2026-02-17 17:42:04.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.670 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.671 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.671 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.671 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.727 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:05 compute-0 nova_compute[186479]: 2026-02-17 17:42:05.727 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:07 compute-0 nova_compute[186479]: 2026-02-17 17:42:07.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:09 compute-0 nova_compute[186479]: 2026-02-17 17:42:09.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.327 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.328 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.328 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.455 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.456 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5721MB free_disk=73.20578002929688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.456 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.456 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.513 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.514 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.543 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.566 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.568 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.568 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:42:10 compute-0 podman[225180]: 2026-02-17 17:42:10.708967456 +0000 UTC m=+0.046471400 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 17 17:42:10 compute-0 nova_compute[186479]: 2026-02-17 17:42:10.727 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:10 compute-0 podman[225181]: 2026-02-17 17:42:10.739781835 +0000 UTC m=+0.070212930 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 17 17:42:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:42:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:42:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:10.965 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:42:11 compute-0 nova_compute[186479]: 2026-02-17 17:42:11.569 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:11 compute-0 nova_compute[186479]: 2026-02-17 17:42:11.569 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:42:12 compute-0 nova_compute[186479]: 2026-02-17 17:42:12.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:13 compute-0 nova_compute[186479]: 2026-02-17 17:42:13.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:15 compute-0 nova_compute[186479]: 2026-02-17 17:42:15.729 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:16 compute-0 podman[225221]: 2026-02-17 17:42:16.718220818 +0000 UTC m=+0.058329345 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:42:19 compute-0 nova_compute[186479]: 2026-02-17 17:42:19.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:42:19 compute-0 nova_compute[186479]: 2026-02-17 17:42:19.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:42:19 compute-0 nova_compute[186479]: 2026-02-17 17:42:19.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:42:19 compute-0 nova_compute[186479]: 2026-02-17 17:42:19.332 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.731 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.733 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.733 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.733 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.784 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:20 compute-0 nova_compute[186479]: 2026-02-17 17:42:20.785 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:24 compute-0 podman[225245]: 2026-02-17 17:42:24.770591081 +0000 UTC m=+0.106241065 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.786 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.788 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.788 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.788 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.836 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:25 compute-0 nova_compute[186479]: 2026-02-17 17:42:25.837 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:29 compute-0 podman[225271]: 2026-02-17 17:42:29.746024781 +0000 UTC m=+0.090217506 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.838 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.839 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.840 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.840 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.888 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:30 compute-0 nova_compute[186479]: 2026-02-17 17:42:30.889 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:33 compute-0 podman[225295]: 2026-02-17 17:42:33.699831806 +0000 UTC m=+0.045661061 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7)
Feb 17 17:42:35 compute-0 nova_compute[186479]: 2026-02-17 17:42:35.889 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.891 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.892 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.892 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.893 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.893 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:40 compute-0 nova_compute[186479]: 2026-02-17 17:42:40.894 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:41 compute-0 podman[225316]: 2026-02-17 17:42:41.707792647 +0000 UTC m=+0.053683901 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:42:41 compute-0 podman[225317]: 2026-02-17 17:42:41.726857043 +0000 UTC m=+0.062284792 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute)
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.723 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.724 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.725 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.725 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.725 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:42:43.725 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:42:45 compute-0 nova_compute[186479]: 2026-02-17 17:42:45.893 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:45 compute-0 nova_compute[186479]: 2026-02-17 17:42:45.894 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:46 compute-0 nova_compute[186479]: 2026-02-17 17:42:46.897 186483 DEBUG oslo_concurrency.processutils [None req-b17d33f8-c40a-4692-ae48-cfa2fc297bc8 796d714b69a84bb693e13bfee74071c4 02ed9754ecd847a6a89524591c01aa73 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 17 17:42:46 compute-0 nova_compute[186479]: 2026-02-17 17:42:46.923 186483 DEBUG oslo_concurrency.processutils [None req-b17d33f8-c40a-4692-ae48-cfa2fc297bc8 796d714b69a84bb693e13bfee74071c4 02ed9754ecd847a6a89524591c01aa73 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 17 17:42:47 compute-0 podman[225358]: 2026-02-17 17:42:47.715012982 +0000 UTC m=+0.055439540 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.895 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.897 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.899 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.899 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.900 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:42:50 compute-0 nova_compute[186479]: 2026-02-17 17:42:50.902 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:52.600 105898 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '7a:bf:46', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '42:a1:c8:7d:f2:63'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 17 17:42:52 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:52.601 105898 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 17 17:42:52 compute-0 nova_compute[186479]: 2026-02-17 17:42:52.602 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:53 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:42:53.606 105898 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ee0cee2f-3200-4f1f-8903-57b18789347d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 17 17:42:55 compute-0 podman[225382]: 2026-02-17 17:42:55.803229287 +0000 UTC m=+0.130633850 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 17 17:42:55 compute-0 nova_compute[186479]: 2026-02-17 17:42:55.900 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:42:55 compute-0 nova_compute[186479]: 2026-02-17 17:42:55.903 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:00 compute-0 podman[225409]: 2026-02-17 17:43:00.720037958 +0000 UTC m=+0.066471286 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:43:00 compute-0 nova_compute[186479]: 2026-02-17 17:43:00.903 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:04 compute-0 podman[225434]: 2026-02-17 17:43:04.716142487 +0000 UTC m=+0.051727097 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, release=1770267347, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 17 17:43:05 compute-0 nova_compute[186479]: 2026-02-17 17:43:05.905 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:06 compute-0 nova_compute[186479]: 2026-02-17 17:43:06.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:08 compute-0 nova_compute[186479]: 2026-02-17 17:43:08.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.336 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.336 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.337 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.337 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.506 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.507 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.20578002929688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.507 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.508 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.593 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.593 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.622 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.636 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.637 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.637 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.908 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.909 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.910 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.910 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.962 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:10 compute-0 nova_compute[186479]: 2026-02-17 17:43:10.963 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:43:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:43:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:43:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:43:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:43:10.965 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:43:11 compute-0 nova_compute[186479]: 2026-02-17 17:43:11.631 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:11 compute-0 nova_compute[186479]: 2026-02-17 17:43:11.632 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:12 compute-0 nova_compute[186479]: 2026-02-17 17:43:12.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:12 compute-0 podman[225458]: 2026-02-17 17:43:12.704041907 +0000 UTC m=+0.046057046 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:43:12 compute-0 podman[225457]: 2026-02-17 17:43:12.723856208 +0000 UTC m=+0.068851901 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:43:13 compute-0 nova_compute[186479]: 2026-02-17 17:43:13.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:13 compute-0 nova_compute[186479]: 2026-02-17 17:43:13.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:43:15 compute-0 nova_compute[186479]: 2026-02-17 17:43:15.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:15 compute-0 nova_compute[186479]: 2026-02-17 17:43:15.964 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:18 compute-0 podman[225495]: 2026-02-17 17:43:18.726512112 +0000 UTC m=+0.070362536 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.322 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.966 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.966 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.966 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.966 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.967 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:20 compute-0 nova_compute[186479]: 2026-02-17 17:43:20.967 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:25 compute-0 nova_compute[186479]: 2026-02-17 17:43:25.970 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:26 compute-0 podman[225520]: 2026-02-17 17:43:26.744947264 +0000 UTC m=+0.083089471 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:43:30 compute-0 nova_compute[186479]: 2026-02-17 17:43:30.971 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:30 compute-0 nova_compute[186479]: 2026-02-17 17:43:30.972 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:31 compute-0 podman[225547]: 2026-02-17 17:43:31.705316046 +0000 UTC m=+0.048999446 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 17 17:43:35 compute-0 podman[225572]: 2026-02-17 17:43:35.71143118 +0000 UTC m=+0.054819917 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, release=1770267347, vcs-type=git, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9)
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.973 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.974 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:43:35 compute-0 nova_compute[186479]: 2026-02-17 17:43:35.975 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:40 compute-0 nova_compute[186479]: 2026-02-17 17:43:40.976 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:43:43 compute-0 podman[225594]: 2026-02-17 17:43:43.716824792 +0000 UTC m=+0.051934677 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 17 17:43:43 compute-0 podman[225593]: 2026-02-17 17:43:43.738868615 +0000 UTC m=+0.076219254 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 17 17:43:45 compute-0 nova_compute[186479]: 2026-02-17 17:43:45.977 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:45 compute-0 nova_compute[186479]: 2026-02-17 17:43:45.978 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:49 compute-0 podman[225630]: 2026-02-17 17:43:49.69780755 +0000 UTC m=+0.044238652 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:43:50 compute-0 nova_compute[186479]: 2026-02-17 17:43:50.979 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:50 compute-0 nova_compute[186479]: 2026-02-17 17:43:50.980 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:55 compute-0 nova_compute[186479]: 2026-02-17 17:43:55.980 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:55 compute-0 nova_compute[186479]: 2026-02-17 17:43:55.982 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:43:57 compute-0 podman[225654]: 2026-02-17 17:43:57.730620044 +0000 UTC m=+0.076809300 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 17 17:44:00 compute-0 nova_compute[186479]: 2026-02-17 17:44:00.981 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:00 compute-0 nova_compute[186479]: 2026-02-17 17:44:00.982 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:02 compute-0 podman[225681]: 2026-02-17 17:44:02.700349444 +0000 UTC m=+0.044350945 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:44:04 compute-0 nova_compute[186479]: 2026-02-17 17:44:04.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.983 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.984 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.984 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.984 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.985 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:05 compute-0 nova_compute[186479]: 2026-02-17 17:44:05.985 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:06 compute-0 podman[225705]: 2026-02-17 17:44:06.72603374 +0000 UTC m=+0.068320655 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of 
the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Feb 17 17:44:08 compute-0 nova_compute[186479]: 2026-02-17 17:44:08.316 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:09 compute-0 nova_compute[186479]: 2026-02-17 17:44:09.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:44:10.964 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:44:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:44:10.965 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:44:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:44:10.965 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:44:10 compute-0 nova_compute[186479]: 2026-02-17 17:44:10.986 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.329 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.329 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.329 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.329 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.474 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.475 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5745MB free_disk=73.20578002929688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.475 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.476 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.547 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.547 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.677 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.690 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.691 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.692 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.692 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:11 compute-0 nova_compute[186479]: 2026-02-17 17:44:11.692 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 17 17:44:13 compute-0 nova_compute[186479]: 2026-02-17 17:44:13.698 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:13 compute-0 nova_compute[186479]: 2026-02-17 17:44:13.699 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:13 compute-0 nova_compute[186479]: 2026-02-17 17:44:13.699 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:14 compute-0 nova_compute[186479]: 2026-02-17 17:44:14.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:14 compute-0 nova_compute[186479]: 2026-02-17 17:44:14.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 17 17:44:14 compute-0 nova_compute[186479]: 2026-02-17 17:44:14.326 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 17 17:44:14 compute-0 podman[225726]: 2026-02-17 17:44:14.729316131 +0000 UTC m=+0.064844160 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 17 17:44:14 compute-0 podman[225727]: 2026-02-17 17:44:14.729721131 +0000 UTC m=+0.065983817 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Feb 17 17:44:15 compute-0 nova_compute[186479]: 2026-02-17 17:44:15.326 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:15 compute-0 nova_compute[186479]: 2026-02-17 17:44:15.327 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:15 compute-0 nova_compute[186479]: 2026-02-17 17:44:15.327 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:44:15 compute-0 nova_compute[186479]: 2026-02-17 17:44:15.989 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:16 compute-0 nova_compute[186479]: 2026-02-17 17:44:16.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:20 compute-0 podman[225764]: 2026-02-17 17:44:20.703929164 +0000 UTC m=+0.049816426 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.990 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.992 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.993 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.993 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.997 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:20 compute-0 nova_compute[186479]: 2026-02-17 17:44:20.998 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:21 compute-0 nova_compute[186479]: 2026-02-17 17:44:21.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:21 compute-0 nova_compute[186479]: 2026-02-17 17:44:21.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:44:21 compute-0 nova_compute[186479]: 2026-02-17 17:44:21.303 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:44:21 compute-0 nova_compute[186479]: 2026-02-17 17:44:21.421 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:44:25 compute-0 nova_compute[186479]: 2026-02-17 17:44:25.998 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:28 compute-0 podman[225788]: 2026-02-17 17:44:28.779756681 +0000 UTC m=+0.125605641 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 17 17:44:31 compute-0 nova_compute[186479]: 2026-02-17 17:44:31.001 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:33 compute-0 podman[225814]: 2026-02-17 17:44:33.73298024 +0000 UTC m=+0.069371889 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.003 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.004 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.005 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.005 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.040 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:36 compute-0 nova_compute[186479]: 2026-02-17 17:44:36.040 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:37 compute-0 podman[225838]: 2026-02-17 17:44:37.737913083 +0000 UTC m=+0.082552389 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1770267347, io.openshift.expose-services=, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container)
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.041 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.042 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.042 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.042 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.043 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:41 compute-0 nova_compute[186479]: 2026-02-17 17:44:41.044 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:44:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:44:45 compute-0 podman[225859]: 2026-02-17 17:44:45.693876822 +0000 UTC m=+0.039787784 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 17 17:44:45 compute-0 podman[225860]: 2026-02-17 17:44:45.724016741 +0000 UTC m=+0.066713335 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.044 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.046 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.046 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.046 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.081 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:46 compute-0 nova_compute[186479]: 2026-02-17 17:44:46.081 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:44:51 compute-0 nova_compute[186479]: 2026-02-17 17:44:51.082 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:44:51 compute-0 podman[225896]: 2026-02-17 17:44:51.733988261 +0000 UTC m=+0.076667267 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 17 17:44:53 compute-0 nova_compute[186479]: 2026-02-17 17:44:53.101 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:44:56 compute-0 nova_compute[186479]: 2026-02-17 17:44:56.083 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:44:59 compute-0 podman[225920]: 2026-02-17 17:44:59.753735109 +0000 UTC m=+0.095888710 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 17 17:45:01 compute-0 nova_compute[186479]: 2026-02-17 17:45:01.085 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:04 compute-0 podman[225947]: 2026-02-17 17:45:04.724018063 +0000 UTC m=+0.071326997 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:45:06 compute-0 nova_compute[186479]: 2026-02-17 17:45:06.085 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:06 compute-0 nova_compute[186479]: 2026-02-17 17:45:06.087 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:08 compute-0 podman[225973]: 2026-02-17 17:45:08.706817052 +0000 UTC m=+0.050259787 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1770267347, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Feb 17 17:45:09 compute-0 nova_compute[186479]: 2026-02-17 17:45:09.328 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:45:10.966 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:45:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:45:10.967 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:45:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:45:10.967 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.087 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.334 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.335 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.335 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.335 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.462 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.463 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.20578002929688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.463 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.464 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.581 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.581 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.595 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing inventories for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.655 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating ProviderTree inventory for provider c9b7a021-c13f-4158-9f46-47cefef2fece from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.656 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Updating inventory in ProviderTree for provider c9b7a021-c13f-4158-9f46-47cefef2fece with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.667 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing aggregate associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.693 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Refreshing trait associations for resource provider c9b7a021-c13f-4158-9f46-47cefef2fece, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SHA,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_BMI,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_BMI2,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,HW_CPU_X86_SVM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE,HW_CPU_X86_AVX,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_FMA3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_ACCELERATORS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.709 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.724 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.725 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:45:11 compute-0 nova_compute[186479]: 2026-02-17 17:45:11.725 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:45:13 compute-0 nova_compute[186479]: 2026-02-17 17:45:13.720 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:13 compute-0 nova_compute[186479]: 2026-02-17 17:45:13.720 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:15 compute-0 nova_compute[186479]: 2026-02-17 17:45:15.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:16 compute-0 nova_compute[186479]: 2026-02-17 17:45:16.088 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:16 compute-0 nova_compute[186479]: 2026-02-17 17:45:16.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:16 compute-0 podman[225996]: 2026-02-17 17:45:16.710225995 +0000 UTC m=+0.051651679 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 17 17:45:16 compute-0 podman[225997]: 2026-02-17 17:45:16.714748025 +0000 UTC m=+0.057590994 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:45:17 compute-0 nova_compute[186479]: 2026-02-17 17:45:17.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:17 compute-0 nova_compute[186479]: 2026-02-17 17:45:17.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:45:21 compute-0 nova_compute[186479]: 2026-02-17 17:45:21.089 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:22 compute-0 nova_compute[186479]: 2026-02-17 17:45:22.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:45:22 compute-0 nova_compute[186479]: 2026-02-17 17:45:22.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:45:22 compute-0 nova_compute[186479]: 2026-02-17 17:45:22.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:45:22 compute-0 nova_compute[186479]: 2026-02-17 17:45:22.318 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:45:22 compute-0 podman[226036]: 2026-02-17 17:45:22.708848831 +0000 UTC m=+0.044009507 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:45:26 compute-0 nova_compute[186479]: 2026-02-17 17:45:26.091 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:30 compute-0 podman[226060]: 2026-02-17 17:45:30.775190257 +0000 UTC m=+0.108847014 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 17 17:45:31 compute-0 nova_compute[186479]: 2026-02-17 17:45:31.092 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:31 compute-0 nova_compute[186479]: 2026-02-17 17:45:31.093 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:35 compute-0 podman[226086]: 2026-02-17 17:45:35.69627731 +0000 UTC m=+0.041797302 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.094 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.096 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.096 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.096 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.124 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:36 compute-0 nova_compute[186479]: 2026-02-17 17:45:36.124 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:45:39 compute-0 podman[226110]: 2026-02-17 17:45:39.699044662 +0000 UTC m=+0.045959202 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, release=1770267347, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 17 17:45:41 compute-0 nova_compute[186479]: 2026-02-17 17:45:41.125 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:45:46 compute-0 nova_compute[186479]: 2026-02-17 17:45:46.126 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:47 compute-0 podman[226133]: 2026-02-17 17:45:47.736191731 +0000 UTC m=+0.065747991 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 17 17:45:47 compute-0 podman[226132]: 2026-02-17 17:45:47.737132093 +0000 UTC m=+0.075510687 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 17 17:45:50 compute-0 sshd-session[226171]: Invalid user admin from 45.148.10.121 port 52994
Feb 17 17:45:50 compute-0 sshd-session[226171]: Connection closed by invalid user admin 45.148.10.121 port 52994 [preauth]
Feb 17 17:45:51 compute-0 nova_compute[186479]: 2026-02-17 17:45:51.128 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:45:53 compute-0 podman[226173]: 2026-02-17 17:45:53.737704836 +0000 UTC m=+0.066973542 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:45:56 compute-0 nova_compute[186479]: 2026-02-17 17:45:56.130 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:01 compute-0 nova_compute[186479]: 2026-02-17 17:46:01.132 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:46:01 compute-0 nova_compute[186479]: 2026-02-17 17:46:01.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:01 compute-0 nova_compute[186479]: 2026-02-17 17:46:01.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:46:01 compute-0 nova_compute[186479]: 2026-02-17 17:46:01.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:46:01 compute-0 nova_compute[186479]: 2026-02-17 17:46:01.134 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:46:01 compute-0 anacron[31434]: Job `cron.daily' started
Feb 17 17:46:01 compute-0 anacron[31434]: Job `cron.daily' terminated
Feb 17 17:46:01 compute-0 podman[226201]: 2026-02-17 17:46:01.744536933 +0000 UTC m=+0.082619520 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 17 17:46:06 compute-0 nova_compute[186479]: 2026-02-17 17:46:06.138 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:46:06 compute-0 podman[226227]: 2026-02-17 17:46:06.712821948 +0000 UTC m=+0.050437521 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:46:09 compute-0 nova_compute[186479]: 2026-02-17 17:46:09.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:10 compute-0 podman[226252]: 2026-02-17 17:46:10.725356405 +0000 UTC m=+0.058651880 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z)
Feb 17 17:46:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:46:10.968 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:46:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:46:10.969 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:46:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:46:10.969 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.136 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.140 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.378 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.378 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.378 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.379 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.509 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.511 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.20577239990234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.511 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.512 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.616 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.616 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.635 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.663 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.664 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:46:11 compute-0 nova_compute[186479]: 2026-02-17 17:46:11.665 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:46:12 compute-0 nova_compute[186479]: 2026-02-17 17:46:12.665 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:15 compute-0 nova_compute[186479]: 2026-02-17 17:46:15.299 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:15 compute-0 nova_compute[186479]: 2026-02-17 17:46:15.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:15 compute-0 nova_compute[186479]: 2026-02-17 17:46:15.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:16 compute-0 nova_compute[186479]: 2026-02-17 17:46:16.139 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:16 compute-0 nova_compute[186479]: 2026-02-17 17:46:16.141 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:17 compute-0 nova_compute[186479]: 2026-02-17 17:46:17.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:17 compute-0 nova_compute[186479]: 2026-02-17 17:46:17.305 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:17 compute-0 nova_compute[186479]: 2026-02-17 17:46:17.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:46:18 compute-0 podman[226274]: 2026-02-17 17:46:18.72294756 +0000 UTC m=+0.057451591 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:46:18 compute-0 podman[226275]: 2026-02-17 17:46:18.729225271 +0000 UTC m=+0.061357835 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:46:20 compute-0 nova_compute[186479]: 2026-02-17 17:46:20.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:21 compute-0 nova_compute[186479]: 2026-02-17 17:46:21.142 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:24 compute-0 nova_compute[186479]: 2026-02-17 17:46:24.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:46:24 compute-0 nova_compute[186479]: 2026-02-17 17:46:24.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:46:24 compute-0 nova_compute[186479]: 2026-02-17 17:46:24.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:46:24 compute-0 nova_compute[186479]: 2026-02-17 17:46:24.322 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:46:24 compute-0 podman[226309]: 2026-02-17 17:46:24.705963756 +0000 UTC m=+0.047878689 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 17 17:46:26 compute-0 nova_compute[186479]: 2026-02-17 17:46:26.145 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:46:31 compute-0 nova_compute[186479]: 2026-02-17 17:46:31.147 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:32 compute-0 podman[226334]: 2026-02-17 17:46:32.742955982 +0000 UTC m=+0.082766574 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 17 17:46:36 compute-0 nova_compute[186479]: 2026-02-17 17:46:36.148 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:36 compute-0 nova_compute[186479]: 2026-02-17 17:46:36.150 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:37 compute-0 podman[226361]: 2026-02-17 17:46:37.720162905 +0000 UTC m=+0.054220543 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 17 17:46:41 compute-0 nova_compute[186479]: 2026-02-17 17:46:41.150 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:41 compute-0 nova_compute[186479]: 2026-02-17 17:46:41.152 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:41 compute-0 podman[226386]: 2026-02-17 17:46:41.76690409 +0000 UTC m=+0.084188487 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, version=9.7, release=1770267347)
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.720 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.721 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:43 compute-0 ceilometer_agent_compute[196205]: 2026-02-17 17:46:43.722 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 17 17:46:46 compute-0 nova_compute[186479]: 2026-02-17 17:46:46.152 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:46 compute-0 nova_compute[186479]: 2026-02-17 17:46:46.155 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:49 compute-0 podman[226408]: 2026-02-17 17:46:49.722915906 +0000 UTC m=+0.056591589 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 17 17:46:49 compute-0 podman[226407]: 2026-02-17 17:46:49.74211122 +0000 UTC m=+0.082698361 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 17 17:46:51 compute-0 nova_compute[186479]: 2026-02-17 17:46:51.155 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:46:55 compute-0 podman[226444]: 2026-02-17 17:46:55.698068484 +0000 UTC m=+0.040957782 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 17 17:46:56 compute-0 nova_compute[186479]: 2026-02-17 17:46:56.156 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:01 compute-0 nova_compute[186479]: 2026-02-17 17:47:01.159 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:03 compute-0 podman[226468]: 2026-02-17 17:47:03.736629827 +0000 UTC m=+0.082077916 container health_status 96afa0cba0123c4772f87c90d515eeb16f63434de091d6d2da6775ebfca95e75 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.161 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.163 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.164 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.164 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.190 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:47:06 compute-0 nova_compute[186479]: 2026-02-17 17:47:06.191 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:47:08 compute-0 podman[226494]: 2026-02-17 17:47:08.707921635 +0000 UTC m=+0.047387368 container health_status 5648dfa43f376aa2796347ac08606a10ed5775f3ec046a6223e981d4aa18fb12 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 17 17:47:10 compute-0 sshd-session[226518]: Received disconnect from 45.148.10.157 port 18778:11:  [preauth]
Feb 17 17:47:10 compute-0 sshd-session[226518]: Disconnected from authenticating user root 45.148.10.157 port 18778 [preauth]
Feb 17 17:47:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:47:10.971 105898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:47:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:47:10.971 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:47:10 compute-0 ovn_metadata_agent[105893]: 2026-02-17 17:47:10.971 105898 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.192 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.332 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.333 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.333 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.333 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.471 186483 WARNING nova.virt.libvirt.driver [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.473 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.20577239990234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.473 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.473 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.562 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.562 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.590 186483 DEBUG nova.compute.provider_tree [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed in ProviderTree for provider: c9b7a021-c13f-4158-9f46-47cefef2fece update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.609 186483 DEBUG nova.scheduler.client.report [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Inventory has not changed for provider c9b7a021-c13f-4158-9f46-47cefef2fece based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.611 186483 DEBUG nova.compute.resource_tracker [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 17 17:47:11 compute-0 nova_compute[186479]: 2026-02-17 17:47:11.611 186483 DEBUG oslo_concurrency.lockutils [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 17 17:47:12 compute-0 nova_compute[186479]: 2026-02-17 17:47:12.611 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:12 compute-0 podman[226520]: 2026-02-17 17:47:12.732259019 +0000 UTC m=+0.076536613 container health_status 932294d2c969c8ef9ab47f7aad9c8b58ba578067d4294251bec68c047ecba61a (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 17 17:47:15 compute-0 nova_compute[186479]: 2026-02-17 17:47:15.298 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:15 compute-0 nova_compute[186479]: 2026-02-17 17:47:15.302 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:16 compute-0 nova_compute[186479]: 2026-02-17 17:47:16.193 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:47:17 compute-0 nova_compute[186479]: 2026-02-17 17:47:17.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:17 compute-0 sshd-session[226545]: Received disconnect from 91.224.92.54 port 38550:11:  [preauth]
Feb 17 17:47:17 compute-0 sshd-session[226545]: Disconnected from authenticating user root 91.224.92.54 port 38550 [preauth]
Feb 17 17:47:18 compute-0 nova_compute[186479]: 2026-02-17 17:47:18.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:18 compute-0 nova_compute[186479]: 2026-02-17 17:47:18.304 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 17 17:47:19 compute-0 nova_compute[186479]: 2026-02-17 17:47:19.303 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:20 compute-0 podman[226547]: 2026-02-17 17:47:20.720646649 +0000 UTC m=+0.047078379 container health_status 2cad748d19729265bd362d508a9e095aafe8c62a0e94073ad231ddc1cf853998 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 17 17:47:20 compute-0 podman[226548]: 2026-02-17 17:47:20.732773673 +0000 UTC m=+0.050552194 container health_status f34369d14e317388860f3f814ae116989bbd235bc03143388feefba8e901e8b1 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6af267490ddb265931920bcf4d0eb649f15a68586e977301664b17a99008def5-512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 17 17:47:21 compute-0 sshd-session[226588]: Accepted publickey for zuul from 192.168.122.10 port 41364 ssh2: ECDSA SHA256:+U7vSwQZvFvLlKGEfHWMlLBUV1LwgLTIxJVbiTi7h3A
Feb 17 17:47:21 compute-0 nova_compute[186479]: 2026-02-17 17:47:21.195 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:21 compute-0 systemd-logind[806]: New session 29 of user zuul.
Feb 17 17:47:21 compute-0 systemd[1]: Started Session 29 of User zuul.
Feb 17 17:47:21 compute-0 sshd-session[226588]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 17 17:47:21 compute-0 sudo[226592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 17 17:47:21 compute-0 sudo[226592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 17 17:47:25 compute-0 ovs-vsctl[226762]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 17 17:47:25 compute-0 nova_compute[186479]: 2026-02-17 17:47:25.304 186483 DEBUG oslo_service.periodic_task [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 17 17:47:25 compute-0 nova_compute[186479]: 2026-02-17 17:47:25.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 17 17:47:25 compute-0 nova_compute[186479]: 2026-02-17 17:47:25.305 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 17 17:47:25 compute-0 nova_compute[186479]: 2026-02-17 17:47:25.320 186483 DEBUG nova.compute.manager [None req-3322d76e-423e-4cb8-95df-1d820cbe4811 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 17 17:47:25 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 17 17:47:26 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 17 17:47:26 compute-0 virtqemud[185833]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.198 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.201 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.201 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.201 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.235 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 17 17:47:26 compute-0 nova_compute[186479]: 2026-02-17 17:47:26.236 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 17 17:47:26 compute-0 podman[226961]: 2026-02-17 17:47:26.408933399 +0000 UTC m=+0.060235538 container health_status 9bfa56419625b16fd7eaeeca7a40a0d97c7783cb3c31583860a1c3eb33cd44f6 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '512876873c81edb4a347766e247c84756d0e00571a3eec1459fd81a92cb54361-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 17 17:47:27 compute-0 crontab[227190]: (root) LIST (root)
Feb 17 17:47:28 compute-0 systemd[1]: Starting Hostname Service...
Feb 17 17:47:28 compute-0 systemd[1]: Started Hostname Service.
Feb 17 17:47:31 compute-0 nova_compute[186479]: 2026-02-17 17:47:31.236 186483 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
